Audiovisual: A Short History of the LED Large Area Video ...

06 May 2024

 


In late 2009 Robert Simpson wrote an article for Blooloop entitled “A short history of the videowall”. It summarized the 30-year history of videowalls made by ganging together multiple displays based on various technologies. This article is a “companion piece”, describing the emergence of the large area display, initially as used in places like sports arenas, but now finding its way into many applications formerly served by the “traditional” videowall.

Robert Simpson is Founder Director of Electrosonic, and a holder of the INFOCOMM Distinguished Achievement Award. Note – some of the “historical” pictures in this article can also be found in his 1997 book “Videowalls – the Book of the Big Electronic Image”.

Ever since the invention of the light bulb, people have sought to create moving images by using lots of separate light sources. The problems included finding light sources that could be switched fast enough to give realistic motion effects, and finding the technical means to do the switching.


An early manifestation was the “ticker” display. In 1928 such a display was installed in Times Square to display news in running text form, and became known as a “zipper”. It consisted of a ribbon array of incandescent lamps which presented sideways scrolling text.

A typical way of achieving this was to use a mercury bath. A travelling tape, like a pianola roll, and punched with holes corresponding to the required letters, would be drawn across the mercury bath. Resting on the tape were hundreds or thousands of spring contacts, one for each lamp in the array; when a hole passed beneath them contact was made through the mercury and the corresponding bulb would light. This method was used right up to the 1960s –  for example when the rotunda at Birmingham’s Bullring was built it had a display driven this way. It is said that operators of this kind of system suffered from the effects of mercury poisoning.


Another method, used on Broadway, allowed the showing of “movie” images. 35mm high contrast black and white film was projected on to an array of photo-cells, each cell connected to a lamp. The images were low resolution, and the frame rate was low because of the thermal inertia of the lamps. Later on sign companies developed electronic systems that could drive incandescent lamp matrices. These worked well but the resulting display was still limited by the incandescent lamp itself. The Young Electric Sign Company (YESCO) was an expert in this field and its crowning achievement was the engineering of the Fremont Street Experience which opened in December 1995.

 The Fremont Street Experience. Above right the original light bulb installation from 1995, and on the left the LED system installed in 2004.

 The Fremont Street Experience was initially conceived by Jon Jerde, but developed into a more practical form by Mary Koslowski. It involved building a canopy over a four-block long length of street. Under the canopy was an array of 2.1 million light bulbs arranged in four lamp groups to provide colour pixels. The whole display was 1400ft long and 90ft wide (427m × 27m) and was 90ft (27m) above street level. It used more than 120 PCs to drive it. The display gave free shows in the evening and was instrumental in bringing more punters to the downtown area of Las Vegas. It is still a major attraction, but the display was updated to use LED (12.5 million LEDs!) in 2004. The new system used High Definition video as a source from the outset.
 
The “modern” era of large area displays started in the 1980s. By then electronics had developed to the point where the control of a big video display could be achieved, but the choice of lighting element was not at all obvious. The requirement was for performance far beyond that of the incandescent lamp, and included:

–    a luminance high enough to work in bright daylight, up to around 5,000 nit (candelas per square metre). For comparison, a typical TV screen would give around 400 nit.
–    the ability to show colour and grayscale at video frame rates (implying the need for fast switching).

Several different approaches were tried. An obvious contender was the fluorescent lamp. These can be made very small, and if the right phosphors are used can switch on and off very fast. By the 1990s many large fluorescent lamp based displays had been installed by companies like Toshiba (with GiantVision™) and Panasonic (with Astrovision™ – a trademark which is still used for its LED displays). The advantages claimed for this type of display included comparatively long life with better luminance maintenance, and the use of low voltage components reducing the problems of dust attraction and moisture associated with high voltage devices. Its disadvantages were that modulation to achieve different light levels was difficult, especially at low light levels, and that in cold weather local heating of the lamps was needed.


 Above – Back and front view of the display modules (left) and a complete Toshiba GiantVision display based on fluorescent lamp technology. Osaka Nagai Athletic Stadium May 1996.

Most large area displays use emissive light sources, but it is also possible to use transmissive technology, where a constant light source is modulated. When Liquid Crystal Displays (LCD) started to appear as low resolution alphanumeric information displays, it was not long before companies began to exploit the idea of a liquid crystal shutter to control an individual pixel. A typical product was that produced by Lunar A/S of Norway (the same product also appeared under Philips branding). It consisted of 1.3 sq.ft. (0.13 sq.m.) LCD tiles, each carrying 256 three-colour pixels. Colour was achieved using filters.

Two such tiles fitted in to a lighting module, and four modules fitted into a transportable case suitable for stacking. The lighting module was fitted with asymmetric reflectors and diffusers to ensure even illumination. To achieve an evenly illuminated image required stable operating conditions which meant that both the LCD tiles and the lighting system needed independent air conditioning systems.

Left – The Lunar A/S big screen LCD video display installed at the Holmenkollen Ski Jump near Oslo in Norway.

Below right – the main components of the display; two side by side LCD panels at the top, and a lighting unit underneath. Ca 1995.

The most successful technology in the pre-LED era was the Cathode Ray Tube (CRT) flood gun. A leader in its application was Mitsubishi in Nagasaki which started producing its DiamondVision™ displays as far back as 1980. This was a curious story in that it was a diversification move for Mitsubishi’s shipyards which had run out of work; so electrical and electronic engineers who had been building ship’s bridge systems etc found themselves building video displays for baseball stadiums.

The flood gun tube is a miniature CRT which does not produce an image, but whose whole front face lights up when the electron beam is “on”. This has the big advantages that the lightsources could now be switched as quickly as TV images, and the colour produced by the display exactly matches the familiar TV set. Mitsubishi had its devices made by Noritake Itron (a manufacturer of vacuum fluorescent displays).

Sony entered the race in 1985. It did so in style with the launch of the Jumbotron™. The first outing was at EXPO 85 in Tsukuba in Japan where Sony installed a Jumbotron with a massive 45m diagonal screen. Sony’s display devices were made by Futaba (another manufacturer of vacuum fluorescent displays).

A smaller player was English Electric in the UK. It had the knowhow and the facilities to manufacture its own display devices, and its product was called StarVision. Although it was well regarded, and was used by hire companies, it never received the ongoing investment that it needed to stay in the market.


Above – The Sony Jumbotron at EXPO 85. With nearly 1000 sq.m. of screen (over 10,000 sq.ft.) it could be seen at a great distance, and a Sumo wrestler appearing on the horizon was a terrifying sight!


Both the original DiamondVision and the Jumbotron produced excellent pictures and were widely used, both in permanent installations (predominantly in sports stadiums) and for rental. But the big problem was display life, typically 8000 hours for the Futaba devices and up to 12000 hours for the Noritake Itron devices. While this might have been just sufficient for stadium displays that only ran a few hours a week, it made any kind of permanent installation running 12 hours a day or more horribly expensive to run.  

Left – A sub-assembly from a Sony Jumbotron. Note that the display elements used by both Mitsubishi and Sony changed over time, and depended on the size of display. This particular module is in The Pixel Depot’s “museum”.
 
Below right – An English Electric Starvision display, used at the Oval Cricket Ground for several years. Photo from Screenco.
 

The big question today is “why did it take so long for LED to do the job?” Today we are exhorted to buy LED (Light Emitting Diode) lighting because it is so efficient, but the early LEDs from the 1960s were hopelessly inefficient. In the late 1980s Aluminium Indium Gallium Phosphide LEDs arrived; these provided an efficient source of red and amber, and were used to good effect in information displays. But it was still impossible to achieve full colour. The available “green” was hardly green at all – mostly yellow – and an early blue needed a power station to run it. It was only when Shuji Nakamura, then at Nichia Chemical, announced the development of the blue (and later green) LED based on Indium Gallium Nitride that the game was really on for big LED video displays.

Some of the early LED displays were of dubious quality because they continued to compromise on the type of LED used. In order to achieve reasonable colour and light output many more green LEDs were needed, resulting in rather odd pixel configurations. Within a few years, however, and certainly since the start of the 21st Century LED had not only taken over the existing limited market, but because of falling prices and improved lifetime and efficiency had allowed many new markets to develop.

Left – In the mid to late 1990s the LED clusters used to make up pixels might need, for example in the 40mm pitch display, nine green LEDs to every two blue and one red. Photos from Rainbow Vision.

Sports venues have represented the biggest market, carrying on from where the original DiamondVision and Jumbotron displays left off (incidentally Mitsubishi remains a formidable supplier in this field having transferred the DiamondVision name to its LED product line; but Sony has effectively retired from the field).

The pop concert was an early user of video screens, and top-line acts could afford to use Jumbotrons, and later LED. However, the whole idea of what could be done with LED was given an early shake-up by Mark Fisher’s design for U2’s “Popmart” tour of 1997. He realized that with long viewing distances wide pixel spacing could be used to achieve very large images, especially if viewed at night. The system had to be suitable for touring, so an open mesh arrangement was used that could be rolled up for transport. The whole display was 52m (170ft) wide and 17m (56ft) high, with a total of 150,000 pixels. Amazingly, the company that supplied the LED pixels and their driving system, SACO Technologies of Montreal, had never engineered a video system before, being more used to building mimic panels for power station control rooms.

  Right – The “Popmart” tour set. In the daytime picture the mesh arrangement carrying the pixels can be seen. Photos from SACO.

SACO went on to build the iconic NASDAQ display on Times Square, and the whole area now sports acres of LED. LED large area displays are ubiquitous, as advertising hoardings on the highway or digital signage in the shopping mall. Manufacture is largely based in the Far East, mostly in China. On a recent visit to Shenzhen a colleague was able to find 50 different manufacturers in a morning. Not surprisingly quality varies, with Nichia LEDs still representing the “gold standard”, but it is fair to say that the variety of products now available is opening up new applications.

A prime example of how LED displays shook off their image as a gaudy advertising medium and were more than ready to be considered for high quality, high definition media presentation is the lobby display at Comcast’s Philadelphia headquarters. Installed in 2008, the display is 25.4 feet (7.7 m) high and 83.3 feet (25.4 m) wide. It uses 10 million LEDs to cover the 2,000 sq.ft. (190 sq.m.) of display, using 6,771 Barco NX-4 4mm pitch LED panels. The installation was designed and produced by Niles Creative Group and runs 18 hours each day. The content is great fun, sometimes consisting of special “holiday” shows, but mainly based on imaginative animated sequences, images of space and views of Philadelphia.


 Above – The amazing lobby display at Comcast’s headquarters in Philadelphia installed in 2008. The clock is built up over an amusing animated sequence and, when complete, tells the right time.

As costs have come down and technologies have improved, LED video displays are proving to be a versatile medium, not at all limited to the flat screen. Flexible substrates have allowed the introduction of curved displays, and LED “floors” have become common. They are increasingly being used as scenic elements in the theatre – for example, while the production of “Spiderman – The Musical” on Broadway may have had a complicated gestation, scenic elements that at the start had been expected to be projected had become LED by the time the production opened (about a year later), because prices had come down so much.

Left – The Pixel Depot, near Dorking (UK) is a useful facility where potential users can see 57 varieties of LED. The top image shows a dance floor display and underneath a flexible LED display (sold as “digiFLEX”) is shown applied to the curves of a grand piano.

The Summer Olympic Games 2012 in London saw one of the most imaginative uses of LED. Associated with each seat, a “tablet” carrying nine LEDs became a pixel within a giant display the size of the whole stadium; the resulting effect amazed the world. The concept was simple: just use a standard LED video display driving system (in this case Barco electronics) to drive widely spaced pixels. It sounds simple, but it was actually a serious piece of engineering by Tait Technologies, which in a 14-week period engineered, built and installed 70,000 tablets and their associated 370km (nearly 250 miles) of cabling.

Right – Two views of the “tablet” LED units used at the Summer Olympic Games in London 2012. Photos courtesy Tait Technologies.

An indication of things to come (or already here) was to be seen at this year’s (2013) Integrated Systems Europe trade show in Amsterdam. Lang, a German dry hire and professional sales company, can always be relied upon to put on a spectacular and stimulating display at ISE, and in 2013 it excelled itself by installing a 4K (Quad HD 3840 × 2160) LED display 7.3m × 4.1m (24ft × 13.5ft). It is an obvious, but easily overlooked, point that if a 4K projector were to project an image of the same size, the pixel pitch would be the same. Not surprisingly, many visitors to ISE were looking for projectors because they could not believe it was LED. The display used 1.9mm pitch LED modules from Silicon Core Technology (SCT). In practice at this pitch pixels are not detectable at a two metre (six foot) viewing distance.
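The pitch figure quoted above follows directly from the display's width and pixel count. A quick back-of-envelope sketch in Python (the helper function is purely illustrative; the numbers plugged in are the 7.3m width and 3840-pixel count quoted for the Lang display):

```python
# Back-of-envelope check of pixel pitch for a display of a given width.
# The figures used are those quoted above for the Lang ISE 2013 wall;
# the function itself is illustrative, not any vendor's tooling.

def pixel_pitch_mm(width_m: float, horizontal_pixels: int) -> float:
    """Centre-to-centre pixel spacing, in millimetres."""
    return width_m * 1000.0 / horizontal_pixels

print(f"{pixel_pitch_mm(7.3, 3840):.2f} mm")  # prints "1.90 mm"
```

This matches the 1.9mm pitch of the SCT modules, and confirms the article's point: a 4K projector filling the same 7.3m width would produce exactly the same pixel spacing.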

It should be noted that other vendors (e.g. Leyard) are also now offering 1.9mm pitch LED, and that SCT itself is prototyping 1.5mm pitch modules. Another point is that at screen luminances suitable for indoor use (say 600 nit and average video content) the power consumption is remarkably low, contradicting claims from other technologies that LED is very greedy on power and expensive to run.


Above – The 4K LED display shown by Lang at ISE 2013 attracted the crowds; many thought it must be a projected image.

In under 20 years the LED large area display has come a long way, but recent developments (which include OLED, Organic Light Emitting Diodes) will expand its use into many new areas.

 

THE EVOLUTION OF VIDEOWALLS



In the 1980s video projection was in its infancy and was based on cathode ray tube (CRT) technology. The biggest challenge with video projection at the time was brightness: a typical CRT projector could achieve only about 600 to 800 lumens. This meant that projected images were limited to about 4 metres in width, and required a darkened room.

Monitors and television sets also made use of CRT technology and were limited in size to about 70cm diagonally. In order to overcome these challenges innovators in the industry started "lacing" projectors or monitors together to achieve bigger images.

 

The first multi-image video displays

In 1985 the first multi-image video displays consisted of two or more projectors (or monitors) projecting side by side onto the same screen. A typical setup would consist of two videotape recorders synchronized via time code: one for the left-hand projector and one for the right. In order to shoot the material for this setup, two cameras mounted side by side would be used. Although other methods of splitting the image could also be used, this was the favoured method as it retained the full resolution of each image. By 1987 video tape recorders had been replaced by video disc players, and by 1996 by DVD players.



The introduction of the videowall processor

In the mid-1980s the first true videowall processors were introduced to the audiovisual world. They consisted of large racks of equipment connected by metres of multicore cable. Because they could only handle standard PAL or NTSC resolutions they had very limited functionality, typically handling only up to four simultaneous video inputs.

However, they were capable of including basic video effects and were able to freeze an image per display.



Videowall monitors

In 1985 purpose-built videowall monitors did not exist. Instead, the first videowalls used modified CRT television sets as the displays. The TVs were modified to accept RGB video and fitted into custom-designed sheet metal cabinets. The steel cabinets served three main purposes: they allowed the TVs to be stacked; they reduced the image-to-image gap; and they provided electromagnetic shielding.

Some of the early challenges faced by the technicians were matching the colours across the displays and aligning the images. Colour purity across individual screens was a problem, due to the magnetic effect of adjacent monitors.

Needless to say, the whole system was bulky, difficult to set-up and very sensitive.

 

Video projectors

By 1985, after the introduction of the videowall processor in South Africa, events companies started making use of arrays of video projectors as displays. These usually consisted of CRT projectors mounted in custom frames projecting onto rear projection material. This was followed by purpose-built projection cubes which utilized mirrors and rigid rear projection Fresnel screens. They were also designed to stack and were specially mechanized to help with image alignment.

 

Hybrid walls

In the early years of videowalls, between 1985 and 1990, video projection and multi-image slide projection were sometimes mixed. This was especially useful when large numbers of video projectors were not freely available.

 

The resolution revolution

Early video walls had two main problems: resolution, and wide area flicker.

The resolution of the early videowall was based on PAL or NTSC standards. It therefore had a maximum resolution of 625 TV lines across the entire image. This made it almost impossible to see any detail when magnified across multiple displays.
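The scale of the problem is easy to illustrate with simple arithmetic. Assuming the roughly 576 active lines carried by a 625-line PAL signal (a standard figure, not stated in the text above), each monitor in a wall receives only a fraction of the source detail:

```python
# Illustrative arithmetic: how few source scan lines each monitor gets
# when a single PAL image is magnified across a wall of displays.
# Assumption: ~576 active (visible) lines in a 625-line PAL signal.

PAL_ACTIVE_LINES = 576

def lines_per_display(wall_rows: int) -> float:
    """Source scan lines available to each screen in a wall of given height."""
    return PAL_ACTIVE_LINES / wall_rows

for rows in (2, 3, 4):
    print(f"{rows} rows high: {lines_per_display(rows):.0f} lines per screen")
```

So a 4×4 wall leaves each monitor magnifying only around 144 source lines, which is why fine detail was effectively invisible.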

Wide area flicker occurred as a result of the slow refresh rate of the image (25 frames per second). While this was slightly annoying on standard television sets, it became a real problem when displayed across a large screen, and especially so in a darkened room.

 

The graphics processor

In the early 1990s Video Graphics Array (VGA) was introduced as a computer graphics standard, and graphic videowall processors soon followed. This new "high resolution" standard (at that stage only 640 x 480 pixels) opened up a whole new market for video or "graphic walls", as they came to be known. New applications included control rooms and monitoring facilities, which typically displayed graphical representations of processing plants and computer or telephone networks.

One of the advantages of the graphic processor was the ability to display different resolutions and video standards, simultaneously, on the same wall.

VGA quickly developed into SVGA, XGA and so on. Today videowall processors can display images of up to 4K (4 × HD resolution).

 

Videowall cubes

Video projectors rapidly adopted new technologies, including liquid crystal display (LCD) and later digital light processing (DLP) technology. These new techniques allowed for brighter and eventually higher resolution images. This technology was soon incorporated into videowall cubes: specially designed "boxes" which house the projector, a mirror and a rear projection screen. The mirror is used to fold the light path, thereby reducing the required depth of the cube. Another important function of the cube is to exclude any extraneous light from reaching the rear of the screen, which improves the contrast of the image. A problem with these new technologies was the cost of ownership, in particular the expense of lamp replacement. In order to retain uniform brightness across all displays, it was necessary to replace all the lamps at the same time. DLP projectors also required colour wheel replacements.

When static, high-contrast images were displayed on LCD projectors for extended periods they suffered from "image burn", or image retention. Because of this, DLP was the preferred projection technology for videowall cubes.

Projection cubes are still in use today, although the preferred illumination source is now the light-emitting diode (LED). The long life of LED illumination has brought the overall cost of ownership down significantly.



Flat panel displays

In 1995 flat panel television sets based on plasma technology were launched. The AV industry was soon using them as videowall displays. The main problem with using them in videowalls was the wide frame or bezel surrounding them. Later on, "bezel-less" plasmas specifically designed for videowalls were introduced. These did not produce a seamless image and were fragile, so they were more suited to fixed installations than to the rental industry.

In 2003 large LCD flat panels with 46" screens were launched. Although they still had large bezels, the size made them viable for videowall applications. This changed in 2006 with the introduction of thin-bezel LCD displays. The next few years saw ever thinner bezels, with the slimmest at about 5mm image to image. The introduction of LED edge-lit and direct-lit LCD displays has made this the most popular display technology to date.

Today's videowall processor

In the second decade of the 21st century the most sophisticated videowall processors are card-based and driven by powerful computers. They are custom configured for the number of displays and the number and type of source. Usually the inputs are hardwired but can also be decoded from an IP stream, using built-in decoders in the processor.

Most dedicated videowall displays include scalers and daisy chain inputs. This allows a video input to be daisy-chained through the monitors, with onboard software displaying a portion of the image. In this way a large image can be displayed across multiple monitors. However, this technology is limited to a single input, with little or no effects.

A software based distributed system is also available, but requires a computer per display, which is often fitted in an optional slot in the display. This setup allows for multiple images to be displayed across the display.

Videowall remains an important medium for digital signage and control room applications. They are most commonly found in military, communication, surveillance and advertising applications.



The first videowall in South Africa

I was privileged to be involved with the installation of, what I believe, the first true videowall in South Africa, and one of the first in the world.

In the mid 1980's Electrosonic UK started developing a videowall processing system which was to become known the Picbloc system. PIC was an abbreviation for programmable image controller. This new product line incorporated a "new generation of large scale integrated circuits".

Johann Kruger, owner of Multivisio, was the first person to invest in videowall technology in South Africa. Upon hearing of this new technology, he travelled to Electrosonic in the UK to see what all the hype was about. After the demonstration of the prototype he was so impressed, that even though the product was still in the development phase, he immediately placed an order for a system for an upcoming product launch.

Lourie Coetzee, who was the owner of Twin Imports and the exclusive distributor of Electrosonic products arranged the importation and logistics of this equipment. I was employed by Twin Imports as a technician, and was responsible for the technical aspects of the project. The equipment arrived and consisted of flight cases populated with 2U rack mount boxes. Each video input required a digitiser which was housed in a 19" 3U cabinet. Likewise each video output required a similar box which was called a PicBloc. Each video input required a data bus consisting of a multicore cable linking the digitiser to the first and subsequent PicBloc.

It was a nightmare to setup, with frequent firmware updates. New EPROMS were shipped via courier and had to be physically replaced in each PicBloc. The modified TV sets were sourced locally and prone to magnetic interference from adjacent sets. Gaffer tape was used to insulate each TV from the next to avoid eddy currents. A lot of tweaking was required to get the monitors displaying a uniform color, but, after many late nights, the wall was finally ready for the product launch. It was a great success and Multivisio went on to do some of the most memorable product launches in South Africa to date, often using videowall technology.

Bruce Genricks

In the 80's video projection was in its infancy and was based on cathode ray tube (CRT) technology. The biggest challenge with video projection at the time was brightness: a typical CRT projector could achieve only about 600 to 800 lumens. This meant that projected images were limited to about 4 meters in width, and required a darkened room.Monitors and television sets also made use of CRT technology and were limited in size to about 70cm diagonally. In order to overcome these challenges innovators in the industry started "lacing" projectors or monitors together to achieve bigger images.In 1985 the first multi-image video displays consisted of two or more projectors (or monitors) projecting side by side onto the same screen. A typical setup would consist of two videotape recorders, synchronized via time code. One time code for the left hand projector and one for the right. In order to shoot the material for this setup, two cameras mounted side by side would be used. Although other methods of splitting the image could also be used, this was the favoured method as it retained the full resolution of each image. By 1987, video tape recorders were replaced by video disc players, and by 1996 by DVD players.In the mid 1980's the first true videowall processors were introduced to the audio visual world. They consisted of large racks of equipment connected by metres of multicore cables. Because they could only handle standard PAL or NTSC resolutions they had very limited functionality; typically handling only up to four simultaneous video inputs.However, they were capable of including basic video effects and were able to freeze an image per display.In 1985 purpose-built videowall monitors did not exist. Instead, the first videowall used modified CRT television sets as the displays. The TVs were modified to accept RGB video and fitted in custom designed sheet metal cabinets. 
The steel cabinets served three main purposes: they allowed the TVs to be stacked; they reduced the image to image gap; and, they provided electro-magnetic shielding.Some of the early challenges faced by the technicians were matching the colours across the displays and aligning the images. Colour purity across individual screens was a problem, due the magnetic effect of adjacent monitors.Needless to say, the whole system was bulky, difficult to set-up and very sensitive.By 1985, after the introduction of the videowall processor in South Africa, events companies started making use of arrays of video projectors as displays. These usually consisted of CRT projectors mounted in custom frames projecting onto rear projection material. This was followed by purpose-built projection cubes which utilized mirrors and rigid rear projection Fresnel screens. They were also designed to stack and were specially mechanized to help with image alignment.In the early years between 1985 to 1990 of video walls, video projection and multi-image slide projection were sometimes mixed. This was especially useful large numbers of video projectors were not freely available.Early video walls had two main problems: resolution, and wide area flicker.The resolution of the early videowall was based on PAL or NTSC standards. It therefore had a maximum resolution of 625 TV lines across the entire image. This made it almost impossible to see any detail when magnified across multiple displays.Wide area flicker occurred as a result of the slow refresh rate of the image (25 frames per second). While this was slightly annoying on standard television sets, it became a real problem when displayed across a large screen, and especially so in a darkened room.In the early 1990's Video Graphics Array (VGA) was introduced as a computer graphic standard, and soon graphic videowall processors followed. 
This new "high resolution" standard (which at that stage was only 640 x 480 pixels) opened up a whole new market for video or graphic walls as they came to be known. New applications included control rooms and monitoring facilities which typically displayed graphical representations of processing plants, computer or telephone networks.One of the advantages of the graphic processor was the ability to display different resolutions and video standards, simultaneously, on the same wall.VGA quickly developed into SVGA, XGA and so on. Today videowall processors can display images of up to 4k (4 x HD resolution).Video projectors rapidly adopted new technologies including liquid crystal display (LCD) and later digital light processing (DLP) technology. These new techniques allowed for brighter and eventually higher resolution images. This technology was soon incorporated into videowall cubes. Videowall cubes are specially designed "boxes" which house the projector, a mirror and a rear projection screen. The mirror is used to fold the light path, thereby reducing the required depth of the cube. Another important function of the cube is to exclude any extraneous light from reaching the rear of the screen. This improves the contrast of the image. A problem with these new technologies was cost of ownership; in particular the expense of lamp replacement. In order to retain uniform brightness across all displays, it was necessary to replace all the lamps at the same time. DLP projectors also required color wheel replacements.When static, high contrast images were displayed on LCD projectors for extended periods they suffered from "image burn" or image retention. Due to this DLP was the preferred projection technology for videowall cubes.Projection cubes are still being used today although the preferred illumination source is now light-emitting diode (LED). 
Due to the long lamp life of LED, this has brought the overall cost of ownership down significantly.In 1995 flat panel television sets based on plasma technology were launched. The AV industry was soon using them as videowall displays. The main problem with using them in video walls was the wide frame or bezel surrounding them. Later on "bezel less" plasmas specifically designed for videowall were introduced. These did not produce a seamless image and were fragile. They were more suited to fixed installations, than to the rental industry.In 2003 large LCD flat panels with a 46" screens were launched. Although they still had large bezels, the size made them viable for videowall applications. This changed in 2006 with the introduction of thin bezel LCD displays. The next few years saw ever thinner bezels with the slimmest at about 5mm image to image. The introduction of LED edge-lit and direct-lit LCD displays have made this the most popular display technology to date.In the second decade of the 21st century the most sophisticated videowall processors are card-based and driven by powerful computers. They are custom configured for the number of displays and the number and type of source. Usually the inputs are hardwired but can also be decoded from an IP stream, using built-in decoders in the processor.Most dedicated videowall displays include scalers and daisy chain inputs. This allows a video input to be daisy-chained through the monitors, with onboard software displaying a portion of the image. In this way a large image can be displayed across multiple monitors. However, this technology is limited to a single input, with little or no effects.A software based distributed system is also available, but requires a computer per display, which is often fitted in an optional slot in the display. This setup allows for multiple images to be displayed across the display.Videowall remains an important medium for digital signage and control room applications. 
They are most commonly found in military, communication, surveillance and advertising applications.I was privileged to be involved with the installation of, what I believe, the first true videowall in South Africa, and one of the first in the world.In the mid 1980's Electrosonic UK started developing a videowall processing system which was to become known the Picbloc system. PIC was an abbreviation for programmable image controller. This new product line incorporated a "new generation of large scale integrated circuits".Johann Kruger, owner of Multivisio, was the first person to invest in videowall technology in South Africa. Upon hearing of this new technology, he travelled to Electrosonic in the UK to see what all the hype was about. After the demonstration of the prototype he was so impressed, that even though the product was still in the development phase, he immediately placed an order for a system for an upcoming product launch.Lourie Coetzee, who was the owner of Twin Imports and the exclusive distributor of Electrosonic products arranged the importation and logistics of this equipment. I was employed by Twin Imports as a technician, and was responsible for the technical aspects of the project. The equipment arrived and consisted of flight cases populated with 2U rack mount boxes. Each video input required a digitiser which was housed in a 19" 3U cabinet. Likewise each video output required a similar box which was called a PicBloc. Each video input required a data bus consisting of a multicore cable linking the digitiser to the first and subsequent PicBloc.It was a nightmare to setup, with frequent firmware updates. New EPROMS were shipped via courier and had to be physically replaced in each PicBloc. The modified TV sets were sourced locally and prone to magnetic interference from adjacent sets. Gaffer tape was used to insulate each TV from the next to avoid eddy currents. 
A lot of tweaking was required to get the monitors displaying a uniform color, but, after many late nights, the wall was finally ready for the product launch. It was a great success and Multivisio went on to do some of the most memorable product launches in South Africa to date, often using videowall technology.

Both the original DiamondVision and the Jumbotron produced excellent pictures and were widely used, both in permanent installations (predominantly in sports stadiums) and for rental. But the big problem was display life, typically 8000 hours for the Futaba devices and up to 12000 hours for the Noritake Itron devices. While this might have been just sufficient for stadium displays that only ran a few hours a week, it made any kind of permanent installation running 12 hours a day or more horribly expensive to run.  

Left – A sub-assembly from a Sony Jumbotron. Note that the display elements used by both Mitsubishi and Sony changed over time, and depended on the size of display. This particular module is in The Pixel Depot's "museum".
 
Below right – An English Electric Starvision display, used at the Oval Cricket Ground for several years. Photo from Screenco.
 

The big question today is "why did it take so long for LED to do the job?" Today we are exhorted to buy LED (Light Emitting Diode) lighting because it is so efficient, but the early LEDs from the 1960s were hopelessly inefficient. In the late 1980s Aluminium Indium Gallium Phosphide LEDs arrived and these provided an efficient source of red and amber, and were used to good effect in information displays. But it was still impossible to achieve full colour. The available "green" was hardly green at all – mostly yellow, and an early blue needed a power station to run it. It was only when Shuji Nakamura, then at Nichia Chemical, announced the development of the blue (and later green) LED based on Indium Gallium Nitride that the game was really on for big LED video displays.

Some of the early LED displays were of dubious quality because they continued to compromise on the type of LED used. In order to achieve reasonable colour and light output many more green LEDs were needed, resulting in rather odd pixel configurations. Within a few years, however, and certainly since the start of the 21st century, LED had not only taken over the existing limited market but, because of falling prices and improved lifetime and efficiency, had allowed many new markets to develop.

Left – In the mid to late 1990s the LED clusters used to make up pixels might need, for example in the 40mm pitch display, nine green LEDs to every two blue and one red. Photos from Rainbow Vision.

Sports venues have represented the biggest market, carrying on from where the original DiamondVision and Jumbotron displays left off (incidentally Mitsubishi remains a formidable supplier in this field having transferred the DiamondVision name to its LED product line; but Sony has effectively retired from the field).

The pop concert was an early user of video screens and top-line acts could afford to use Jumbotrons, and later LED. However the whole idea of what could be done with LED was given an early shake-up by Mark Fisher's design for U2's "Popmart" tour of 1997. He realized that with long viewing distances wide pixel spacing could be used to achieve very large images, especially if viewed at night. The system had to be suitable for touring so an open mesh arrangement was used that could be rolled up for transport. The whole display was 52m (170ft) wide and 17m (56ft) high. It had a total of 150,000 pixels. Amazingly the company that supplied the LED pixels and their driving system, SACO Technologies of Montreal, had never engineered a video system before, being more used to building mimic panels for power station control rooms.

  Right – The “Popmart” tour set. In the daytime picture the mesh arrangement carrying the pixels can be seen. Photos from SACO.

SACO went on to build the iconic NASDAQ display on Times Square, and the whole area sports acres of LED. Now LED large area displays are ubiquitous, as advertising hoardings on the highway, or digital signage in the shopping mall. Manufacture is largely based in the Far East, mostly in China. On a recent visit to Shenzhen a colleague was able to find 50 different manufacturers in a morning. Not surprisingly quality varies, with Nichia LEDs still representing the "gold standard", but it is fair to say that the variety of products now available is opening up new applications.

A prime example of how LED displays shook off their image as a gaudy advertising medium and were more than ready to be considered for high quality, high definition media presentation is the lobby display at Comcast's Philadelphia headquarters. Installed in 2008, the display is 25.4 feet (7.7 m) high and 83.3 feet (25.4 m) wide. It uses 10 million LEDs to cover the 2,000 sq.ft. (190 m2) of display area with 6,771 Barco NX-4 4mm pitch LED panels. The installation was designed and produced by Niles Creative Group and runs 18 hours each day. The content is great fun, sometimes consisting of special "holiday" shows, but mainly based on imaginative animated sequences, images of space and views of Philadelphia.


 Above – The amazing lobby display at Comcast’s headquarters in Philadelphia installed in 2008. The clock is built up over an amusing animated sequence and, when complete, tells the right time.

As costs have come down and technologies have improved, LED video displays are proving to be a versatile medium, not at all limited to the flat screen display. Flexible substrates have allowed the introduction of curved displays and LED "floors" have become common. They are increasingly being used as scenic elements in the theatre. For example, while the Broadway production of "Spiderman – The Musical" may have had a complicated gestation, scenic elements which at the start had been expected to be projected had become LED by the time the production opened (about a year later) because prices had come down so much.

Left – The Pixel Depot, near Dorking (UK) is a useful facility where potential users can see 57 varieties of LED. The top image shows a dance floor display and underneath a flexible LED display (sold as “digiFLEX”) is shown applied to the curves of a grand piano.

The Summer Olympic Games 2012 in London had one of the most imaginative uses of LED. Associated with each seat, a "tablet" carrying nine LEDs became a pixel within a giant display the size of the whole stadium; the resulting effect amazed the world. The concept was simple: just use a standard LED video display driving system (in this case Barco electronics) to drive widely spaced pixels. It sounds simple, but it was actually a serious piece of engineering by Tait Technologies, who in a 14-week period engineered, built and installed 70,000 tablets and their associated 370km (nearly 250 miles) of cabling.

Right – Two views of the “tablet” LED units used at the Summer Olympic Games in London 2012. Photos courtesy Tait Technologies.

An indication of things to come (or already here) was to be seen at this year's (2013) Integrated Systems Europe trade show in Amsterdam. Lang, a German dry hire and professional sales company, can always be relied upon to put on a spectacular and stimulating display at ISE, and in 2013 it excelled itself by installing a 4K (Quad HD 3840 × 2160) LED display 7.3m × 4.1m (24ft × 13.5ft). It is an obvious, but easily overlooked, point that if a 4K projector were to project an image of the same size, the pixel pitch would be the same. Not surprisingly many visitors to ISE were looking for projectors because they could not believe it was LED. The display used 1.9mm pitch LED modules from Silicon Core Technology (SCT). In practice at this pitch pixels are not detectable at a two metre (six foot) viewing distance.
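These figures hang together as simple geometry: a 3840-pixel-wide image at 1.9mm pitch works out to almost exactly the 7.3m width quoted, and the apparent size of a pixel at a given distance follows from basic trigonometry. A back-of-envelope sketch (not from the article, and making no claim about exactly when pixels become invisible):

```python
import math

PITCH_MM = 1.9   # module pixel pitch quoted in the article
H_PIXELS = 3840  # Quad HD horizontal resolution

# Physical width implied by pitch x pixel count
width_m = PITCH_MM / 1000 * H_PIXELS
print(f"implied width: {width_m:.2f} m")  # ~7.30 m, matching the quoted 7.3 m

def pixel_arcmin(pitch_mm: float, distance_m: float) -> float:
    """Angle subtended by one pixel at the viewer's eye, in arcminutes."""
    return math.degrees(math.atan((pitch_mm / 1000) / distance_m)) * 60

# At 2 m, one 1.9 mm pixel subtends only a few arcminutes of visual angle
print(f"{pixel_arcmin(PITCH_MM, 2.0):.1f} arcmin at 2 m")
```

The same arithmetic explains the "same pitch as a 4K projector" remark: for a fixed image size and resolution, the pixel pitch is the size divided by the pixel count, regardless of the display technology.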

It should be noted that other vendors (e.g. Leyard) are also now offering 1.9mm pitch LED, and that SCT itself is prototyping 1.5mm pitch modules. Another point is that at screen luminances suitable for indoor use (say 600 nits and average video content) the power consumption is remarkably low, contradicting claims from other technologies that LED is very greedy on power and expensive to run.


Above – The 4K LED display shown by Lang at ISE 2013 attracted the crowds; many thought it must be a projected image.

In under 20 years the LED large area display has come a long way, but recent developments (which include OLED, Organic Light Emitting Diodes) will expand its use into many new areas.

 

THE EVOLUTION OF VIDEOWALLS



In the 80's video projection was in its infancy and was based on cathode ray tube (CRT) technology. The biggest challenge with video projection at the time was brightness: a typical CRT projector could achieve only about 600 to 800 lumens. This meant that projected images were limited to about 4 meters in width, and required a darkened room.

Monitors and television sets also made use of CRT technology and were limited in size to about 70cm diagonally. In order to overcome these challenges innovators in the industry started "lacing" projectors or monitors together to achieve bigger images.

 

The first multi-image video displays

In 1985 the first multi-image video displays consisted of two or more projectors (or monitors) projecting side by side onto the same screen. A typical setup would consist of two videotape recorders synchronized via time code: one feeding the left-hand projector and one the right. In order to shoot the material for this setup, two cameras mounted side by side would be used. Although other methods of splitting the image could also be used, this was the favoured method as it retained the full resolution of each image. By 1987, video tape recorders were replaced by video disc players, and by 1996 by DVD players.



The introduction of the videowall processor

In the mid 1980's the first true videowall processors were introduced to the audio visual world. They consisted of large racks of equipment connected by metres of multicore cables. Because they could only handle standard PAL or NTSC resolutions they had very limited functionality; typically handling only up to four simultaneous video inputs.

However, they were capable of including basic video effects and were able to freeze an image per display.



Videowall monitors

In 1985 purpose-built videowall monitors did not exist. Instead, the first videowall used modified CRT television sets as the displays. The TVs were modified to accept RGB video and fitted in custom designed sheet metal cabinets. The steel cabinets served three main purposes: they allowed the TVs to be stacked; they reduced the image to image gap; and, they provided electro-magnetic shielding.

Some of the early challenges faced by the technicians were matching the colours across the displays and aligning the images. Colour purity across individual screens was a problem, due to the magnetic effect of adjacent monitors.

Needless to say, the whole system was bulky, difficult to set-up and very sensitive.

 

Video projectors

By 1985, after the introduction of the videowall processor in South Africa, events companies started making use of arrays of video projectors as displays. These usually consisted of CRT projectors mounted in custom frames projecting onto rear projection material. This was followed by purpose-built projection cubes which utilized mirrors and rigid rear projection Fresnel screens. They were also designed to stack and were specially mechanized to help with image alignment.

 

Hybrid walls

In the early years of video walls, between 1985 and 1990, video projection and multi-image slide projection were sometimes mixed. This was especially useful when large numbers of video projectors were not freely available.

 

The resolution revolution

Early video walls had two main problems: resolution, and wide area flicker.

The resolution of the early videowall was based on PAL or NTSC standards. It therefore had a maximum resolution of 625 TV lines across the entire image. This made it almost impossible to see any detail when magnified across multiple displays.
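The arithmetic behind this limitation is worth spelling out: spread one 625-line PAL frame across a stack of monitors and each monitor carries only its fraction of the source lines, however large its own tube. A minimal sketch (the wall sizes are hypothetical examples, not from the article):

```python
# Effective source resolution per monitor when a single PAL frame
# is magnified across a wall that is `wall_rows` monitors high.
PAL_LINES = 625  # total TV lines in the source frame

def lines_per_monitor(wall_rows: int) -> float:
    """Source lines landing on each monitor in a stack of wall_rows."""
    return PAL_LINES / wall_rows

for rows in (2, 3, 4):
    print(f"{rows}x{rows} wall: ~{lines_per_monitor(rows):.0f} lines per monitor")
```

So on a 4×4 wall each monitor shows only about 156 source lines blown up to full screen size, which is why fine detail vanished.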

Wide area flicker occurred as a result of the slow refresh rate of the image (25 frames per second). While this was slightly annoying on standard television sets, it became a real problem when displayed across a large screen, and especially so in a darkened room.

 

The graphics processor

In the early 1990's Video Graphics Array (VGA) was introduced as a computer graphic standard, and soon graphic videowall processors followed. This new "high resolution" standard (which at that stage was only 640 x 480 pixels) opened up a whole new market for video or graphic walls as they came to be known. New applications included control rooms and monitoring facilities which typically displayed graphical representations of processing plants, computer or telephone networks.

One of the advantages of the graphic processor was the ability to display different resolutions and video standards, simultaneously, on the same wall.

VGA quickly developed into SVGA, XGA and so on. Today videowall processors can display images of up to 4K (four times HD resolution).

 

Videowall cubes

Video projectors rapidly adopted new technologies including liquid crystal display (LCD) and later digital light processing (DLP) technology. These new techniques allowed for brighter and eventually higher resolution images. This technology was soon incorporated into videowall cubes. Videowall cubes are specially designed "boxes" which house the projector, a mirror and a rear projection screen. The mirror is used to fold the light path, thereby reducing the required depth of the cube. Another important function of the cube is to exclude any extraneous light from reaching the rear of the screen. This improves the contrast of the image. A problem with these new technologies was cost of ownership; in particular the expense of lamp replacement. In order to retain uniform brightness across all displays, it was necessary to replace all the lamps at the same time. DLP projectors also required color wheel replacements.

When static, high contrast images were displayed on LCD projectors for extended periods they suffered from "image burn" or image retention. Due to this DLP was the preferred projection technology for videowall cubes.

Projection cubes are still being used today although the preferred illumination source is now light-emitting diode (LED). Due to the long lamp life of LED, this has brought the overall cost of ownership down significantly.



Flat panel displays

In 1995 flat panel television sets based on plasma technology were launched. The AV industry was soon using them as videowall displays. The main problem with using them in video walls was the wide frame or bezel surrounding them. Later on "bezel-less" plasmas specifically designed for videowalls were introduced. These did not produce a seamless image and were fragile. They were more suited to fixed installations than to the rental industry.

In 2003 large LCD flat panels with 46" screens were launched. Although they still had large bezels, the size made them viable for videowall applications. This changed in 2006 with the introduction of thin-bezel LCD displays. The next few years saw ever thinner bezels, with the slimmest at about 5mm image to image. The introduction of LED edge-lit and direct-lit LCD displays has made this the most popular display technology to date.

Today's videowall processor

In the second decade of the 21st century the most sophisticated videowall processors are card-based and driven by powerful computers. They are custom configured for the number of displays and the number and type of sources. Usually the inputs are hardwired, but they can also be decoded from an IP stream using built-in decoders in the processor.

Most dedicated videowall displays include scalers and daisy chain inputs. This allows a video input to be daisy-chained through the monitors, with onboard software displaying a portion of the image. In this way a large image can be displayed across multiple monitors. However, this technology is limited to a single input, with little or no effects.
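The cropping that such onboard software performs can be sketched in a few lines: each monitor in the chain selects its own rectangle of the shared source frame and scales it to full screen. A simplified illustration (not any vendor's actual firmware; the grid and resolution are example values):

```python
def tile_region(src_w: int, src_h: int, cols: int, rows: int,
                col: int, row: int) -> tuple:
    """Pixel rectangle (x, y, w, h) of the source frame that the
    monitor at grid position (col, row) should scale up and display."""
    tile_w, tile_h = src_w // cols, src_h // rows
    return (col * tile_w, row * tile_h, tile_w, tile_h)

# A 1920x1080 source across a 2x2 wall: the bottom-right monitor
# shows the quadrant starting at (960, 540).
print(tile_region(1920, 1080, 2, 2, 1, 1))  # (960, 540, 960, 540)
```

Because every monitor receives the same daisy-chained signal and merely crops it, the scheme is cheap but inherently single-input, which is the limitation noted above.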

A software-based distributed system is also available, but it requires a computer per display, often fitted in an optional slot in the display. This setup allows multiple images to be displayed across the wall.

Videowalls remain an important medium for digital signage and control room applications. They are most commonly found in military, communication, surveillance and advertising applications.



The first videowall in South Africa

I was privileged to be involved with the installation of what I believe was the first true videowall in South Africa, and one of the first in the world.

In the mid 1980's Electrosonic UK started developing a videowall processing system which was to become known as the Picbloc system. PIC was an abbreviation for programmable image controller. This new product line incorporated a "new generation of large scale integrated circuits".

Johann Kruger, owner of Multivisio, was the first person to invest in videowall technology in South Africa. Upon hearing of this new technology, he travelled to Electrosonic in the UK to see what all the hype was about. After the demonstration of the prototype he was so impressed, that even though the product was still in the development phase, he immediately placed an order for a system for an upcoming product launch.

Lourie Coetzee, who was the owner of Twin Imports and the exclusive distributor of Electrosonic products, arranged the importation and logistics of this equipment. I was employed by Twin Imports as a technician, and was responsible for the technical aspects of the project. The equipment arrived and consisted of flight cases populated with 2U rack-mount boxes. Each video input required a digitiser, which was housed in a 19" 3U cabinet. Likewise each video output required a similar box, called a PicBloc. Each video input required a data bus: a multicore cable linking the digitiser to the first and subsequent PicBlocs.

It was a nightmare to set up, with frequent firmware updates. New EPROMs were shipped via courier and had to be physically replaced in each PicBloc. The modified TV sets were sourced locally and were prone to magnetic interference from adjacent sets. Gaffer tape was used to insulate each TV from the next to avoid eddy currents. A lot of tweaking was required to get the monitors displaying a uniform colour, but, after many late nights, the wall was finally ready for the product launch. It was a great success and Multivisio went on to do some of the most memorable product launches in South Africa to date, often using videowall technology.

Bruce Genricks

In the 80's video projection was in its infancy and was based on cathode ray tube (CRT) technology. The biggest challenge with video projection at the time was brightness: a typical CRT projector could achieve only about 600 to 800 lumens. This meant that projected images were limited to about 4 meters in width, and required a darkened room.Monitors and television sets also made use of CRT technology and were limited in size to about 70cm diagonally. In order to overcome these challenges innovators in the industry started "lacing" projectors or monitors together to achieve bigger images.In 1985 the first multi-image video displays consisted of two or more projectors (or monitors) projecting side by side onto the same screen. A typical setup would consist of two videotape recorders, synchronized via time code. One time code for the left hand projector and one for the right. In order to shoot the material for this setup, two cameras mounted side by side would be used. Although other methods of splitting the image could also be used, this was the favoured method as it retained the full resolution of each image. By 1987, video tape recorders were replaced by video disc players, and by 1996 by DVD players.In the mid 1980's the first true videowall processors were introduced to the audio visual world. They consisted of large racks of equipment connected by metres of multicore cables. Because they could only handle standard PAL or NTSC resolutions they had very limited functionality; typically handling only up to four simultaneous video inputs.However, they were capable of including basic video effects and were able to freeze an image per display.In 1985 purpose-built videowall monitors did not exist. Instead, the first videowall used modified CRT television sets as the displays. The TVs were modified to accept RGB video and fitted in custom designed sheet metal cabinets. 
The steel cabinets served three main purposes: they allowed the TVs to be stacked; they reduced the image-to-image gap; and they provided electromagnetic shielding.

Some of the early challenges faced by the technicians were matching the colours across the displays and aligning the images. Colour purity across individual screens was a problem, due to the magnetic effect of adjacent monitors. Needless to say, the whole system was bulky, difficult to set up and very sensitive.

After the introduction of the videowall processor in South Africa in 1985, events companies started making use of arrays of video projectors as displays. These usually consisted of CRT projectors mounted in custom frames projecting onto rear projection material. This was followed by purpose-built projection cubes, which used mirrors and rigid rear projection Fresnel screens. They were also designed to stack and were specially mechanized to help with image alignment.

In the early years of videowalls, between 1985 and 1990, video projection and multi-image slide projection were sometimes mixed. This was especially useful when large numbers of video projectors were not freely available.

Early videowalls had two main problems: resolution and wide-area flicker. The resolution of the early videowall was based on the PAL or NTSC standards, so it had a maximum resolution of 625 TV lines across the entire image. This made it almost impossible to see any detail when the image was magnified across multiple displays. Wide-area flicker occurred as a result of the slow refresh rate of the image (25 frames per second). While this was slightly annoying on standard television sets, it became a real problem when displayed across a large screen, and especially so in a darkened room.

In the early 1990s Video Graphics Array (VGA) was introduced as a computer graphics standard, and graphic videowall processors soon followed.
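The scale of the resolution problem is easy to quantify. A PAL frame carries only about 576 visible lines out of the nominal 625, so each row of monitors in a wall is left with a fraction of that vertical detail. A rough back-of-envelope sketch (the figures are illustrative, not from any particular system):

```python
# Back-of-envelope figures for an early PAL videowall.
PAL_ACTIVE_LINES = 576   # visible lines out of the 625-line PAL standard
PAL_REFRESH_HZ = 25      # frames per second (50 interlaced fields)

def lines_per_monitor(wall_rows: int) -> float:
    """Vertical detail left for each monitor when one PAL frame
    is stretched over a wall `wall_rows` monitors high."""
    return PAL_ACTIVE_LINES / wall_rows

for rows in (2, 3, 4):
    print(f"{rows}x{rows} wall: ~{lines_per_monitor(rows):.0f} lines per monitor")
```

A 4x4 wall leaves each monitor showing only about 144 lines, refreshed at a rate the eye readily perceives as flicker over a large area.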
This new "high resolution" VGA standard (at that stage only 640 x 480 pixels) opened up a whole new market for what came to be known as graphic walls. New applications included control rooms and monitoring facilities, which typically displayed graphical representations of processing plants and computer or telephone networks. One of the advantages of the graphic processor was the ability to display different resolutions and video standards simultaneously on the same wall. VGA quickly developed into SVGA, XGA and so on; today videowall processors can display images of up to 4K (four times HD resolution).

Video projectors rapidly adopted new technologies, including liquid crystal display (LCD) and later digital light processing (DLP). These allowed for brighter and eventually higher resolution images, and were soon incorporated into videowall cubes. Videowall cubes are specially designed "boxes" which house the projector, a mirror and a rear projection screen. The mirror is used to fold the light path, thereby reducing the required depth of the cube. Another important function of the cube is to exclude any extraneous light from reaching the rear of the screen, which improves the contrast of the image.

A problem with these new technologies was cost of ownership, in particular the expense of lamp replacement. In order to retain uniform brightness across all displays, it was necessary to replace all the lamps at the same time. DLP projectors also required colour wheel replacements. When static, high-contrast images were displayed on LCD projectors for extended periods they suffered from "image burn" or image retention; because of this, DLP became the preferred projection technology for videowall cubes. Projection cubes are still used today, although the preferred illumination source is now the light-emitting diode (LED).
Due to the long life of LED illumination, this has brought the overall cost of ownership down significantly.

In 1995 flat panel television sets based on plasma technology were launched, and the AV industry was soon using them as videowall displays. The main problem with using them in videowalls was the wide frame, or bezel, surrounding them. Later, "bezel-less" plasmas specifically designed for videowalls were introduced. These still did not produce a seamless image and were fragile, making them more suited to fixed installations than to the rental industry.

In 2003 large LCD flat panels with 46" screens were launched. Although they still had large bezels, their size made them viable for videowall applications. This changed in 2006 with the introduction of thin-bezel LCD displays. The next few years saw ever thinner bezels, with the slimmest at about 5mm image to image. The introduction of LED edge-lit and direct-lit LCD displays has made this the most popular display technology to date.

In the second decade of the 21st century the most sophisticated videowall processors are card-based and driven by powerful computers. They are custom configured for the number of displays and the number and type of sources. Usually the inputs are hardwired, but they can also be decoded from an IP stream using decoders built into the processor.

Most dedicated videowall displays include scalers and daisy-chain inputs. This allows a video input to be daisy-chained through the monitors, with onboard software displaying a portion of the image on each screen. In this way a large image can be displayed across multiple monitors. However, this approach is limited to a single input, with little or no effects. A software-based distributed system is also available, but it requires a computer per display, often fitted in an optional slot in the display. This setup allows multiple images to be displayed across the wall.

Videowalls remain an important medium for digital signage and control room applications.
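The "portion of the image" arithmetic that a scaler in a daisy-chained monitor (or a processor output card) performs amounts to a simple crop calculation. The sketch below is a hypothetical illustration of the idea, not any vendor's actual firmware or API:

```python
# Which rectangle of the source should the display at grid position
# (row, col) in an R x C wall scale to full screen?
def crop_region(src_w: int, src_h: int, rows: int, cols: int,
                row: int, col: int) -> tuple[int, int, int, int]:
    """Return (x, y, width, height) of the source rectangle for one tile."""
    tile_w = src_w // cols
    tile_h = src_h // rows
    return (col * tile_w, row * tile_h, tile_w, tile_h)

# A 1920x1080 source across a 2x2 wall: the bottom-right display
# shows the quarter starting at (960, 540).
print(crop_region(1920, 1080, 2, 2, row=1, col=1))  # (960, 540, 960, 540)
```

Because every monitor derives its tile from the same single input, the approach scales a large image cheaply, which is exactly why it cannot offer multiple inputs or effects.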
Videowalls are most commonly found in military, communication, surveillance and advertising applications.

I was privileged to be involved with the installation of what I believe was the first true videowall in South Africa, and one of the first in the world. In the mid 1980s Electrosonic UK started developing a videowall processing system which was to become known as the PicBloc system (PIC was an abbreviation for "programmable image controller"). This new product line incorporated a "new generation of large scale integrated circuits".

Johann Kruger, owner of Multivisio, was the first person to invest in videowall technology in South Africa. On hearing of this new technology, he travelled to Electrosonic in the UK to see what all the hype was about. After the demonstration of the prototype he was so impressed that, even though the product was still in the development phase, he immediately placed an order for a system for an upcoming product launch.

Lourie Coetzee, owner of Twin Imports and the exclusive distributor of Electrosonic products, arranged the importation and logistics of the equipment. I was employed by Twin Imports as a technician and was responsible for the technical aspects of the project. The equipment arrived in flight cases populated with 2U rack-mount boxes. Each video input required a digitiser, housed in a 19" 3U cabinet; likewise, each video output required a similar box, called a PicBloc. Each video input also required a data bus, consisting of a multicore cable linking the digitiser to the first and subsequent PicBlocs.

The system was a nightmare to set up, with frequent firmware updates: new EPROMs were shipped via courier and had to be physically replaced in each PicBloc. The modified TV sets were sourced locally and were prone to magnetic interference from adjacent sets; gaffer tape was used to insulate each TV from the next to avoid eddy currents.
A lot of tweaking was required to get the monitors displaying a uniform colour but, after many late nights, the wall was finally ready for the product launch. It was a great success, and Multivisio went on to do some of the most memorable product launches in South Africa to date, often using videowall technology.
