Monday, August 28, 2017

What the Shrinking Broadcast Spectrum Means



For better or worse, the recent UHF incentive auction is over, with a net to participating broadcasters of about $10B. We have now entered the second, or transition, phase of the reallocation plan, scheduled to last 39 months, at the end of which broadcasters must turn over their auctioned-off frequencies. Options range from exiting the broadcasting business altogether, to ceasing all over-air transmissions and relying entirely on wired delivery (cable or Internet), to consolidating over-air operations with a VHF station in the same broadcast area (a Channel Sharing Agreement, or CSA), to migrating over-air transmissions to the VHF band. Whichever path a station chooses, if it voluntarily participated in the auction, all associated transition costs must be paid out of its auction proceeds. Ideally, for UHF broadcasters transitioning to an alternative form of transmission, there will be something left over to put into an annuity fund, to spend on expanding and improving existing operations, or to invest in new services that grow income.

Among the stations that participated in the auction, the division of the $10B received is far from equal. 175 stations are scheduled to receive some amount of revenue, with shares ranging from a low of about $173K to a high of about $304M. The arithmetic mean payment is about $57.5M, while the median payout sits near the $42M mark.

145 of the 175 participating stations indicate they plan to cease all existing over-air operations, while the other 30 will migrate their over-air operations to VHF. Of the 145 stations planning to cease independent over-air operations, however, all but 17 indicate that, by the close of the auction phase, they had entered into a CSA with a VHF station in their broadcast area.

Legally and financially, a CSA is straightforward. Financially, a VHF station has no way to share in the revenue from the spectrum auction except by entering into a CSA with a local UHF station. Legally, each station in the CSA receives a separate license from the FCC for its half of the shared channel, with perhaps some restrictions on the ability of one partner to sell its half-share license (e.g., a right of first refusal for the other half-share partner).

The more critical issue, which CSA stations must address before the current transition period ends, is how to make a single channel serve two stations. No doubt, the simplest solution is for the two stations to just co-brand a single broadcast stream. However, where the two stations address different audiences—as is perhaps the usual case—this solution may not be workable. Even where it is workable, it may be less than satisfactory, since it would involve two businesses splitting the revenues available from a single stream.

Assume, then, that a shared channel must be made to support two streams, one for each CSA party. Also assume there is no general transition to ATSC 3.0 (with its associated bonanza of additional bandwidth) before the allowed 39 months expire. Then, each partner in a CSA must figure out how to do everything they used to do—or, at least, everything they still want to do—in just half the 19.39 Mbps supported by a 6 MHz ATSC 1.0 channel.

One obvious answer to this technical challenge of doubling capacity in the same bandwidth is to switch from the MPEG-2 compression specified by the ATSC 1.0 standard to MPEG-4 (H.264/AVC) compression. Since MPEG-4 compression is roughly two to three times more efficient than MPEG-2, this would not only allow each station to continue broadcasting a full HD stream but also to simulcast a lower-bitrate SD or mobile stream alongside it. In short, leveraging MPEG-4 compression technology, each partner in a CSA could not only continue to do everything it used to do with a full 6 MHz channel (while now using only half the shared channel's payload), but could also provide new services, with associated opportunities for new revenue.
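
As a rough check on that arithmetic, here is a minimal sketch; the per-stream bitrates are illustrative assumptions drawn from commonly quoted ranges, not measured figures for any particular encoder:

    # Back-of-the-envelope bitrate budget for an ATSC 1.0 channel-sharing agreement.
    # The per-stream bitrates below are illustrative assumptions, not measured figures.
    ATSC_1_0_PAYLOAD_MBPS = 19.39                    # total payload of a 6 MHz ATSC 1.0 channel
    share_per_station = ATSC_1_0_PAYLOAD_MBPS / 2    # ~9.7 Mbps per CSA partner

    mpeg2_hd_mbps = 12.0   # MPEG-2 HD commonly runs 10-15 Mbps -- too big for a half share
    h264_hd_mbps = 6.0     # H.264/MPEG-4 AVC HD can run in the 5-8 Mbps range
    h264_sd_mbps = 1.5     # an H.264 SD or mobile simulcast

    print(f"Half-channel budget:       {share_per_station:.2f} Mbps")
    print(f"MPEG-2 HD fits half share? {mpeg2_hd_mbps <= share_per_station}")
    h264_total = h264_hd_mbps + h264_sd_mbps
    print(f"H.264 HD + SD simulcast:   {h264_total:.1f} Mbps (fits: {h264_total <= share_per_station})")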

But is this feasible? To be sure, some existing over-air receivers can already decode an MPEG-4 signal, and more will have this ability by the end of the 39-month transition period. But how many? Is the figure closer to 10% or to 90% of the potential audience? Assuming it is not 100%, at what point does the opportunity to gain revenue from new MPEG-4-enabled services outweigh the losses from a smaller MPEG-4 audience base? After all, a simulcast aimed at a mobile audience might reach everyone with a cell phone—doubtless an even bigger audience than everyone with a TV. Does that make the switch worth making at 60% of TV households? At 80%? Lower, or higher?
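
One way to frame that question, purely as an illustration with placeholder numbers (none of these revenue figures come from the auction data or any market study), is a simple break-even comparison:

    # Hypothetical break-even sketch: at what MPEG-4 receiver penetration does the
    # expected revenue from new services outweigh the revenue lost from viewers
    # whose sets cannot decode MPEG-4? All figures are placeholders, not market data.
    def mpeg4_switch_pays_off(penetration, base_revenue, new_service_revenue):
        """True if switching to MPEG-4 is a net gain at the given receiver penetration."""
        lost = base_revenue * (1.0 - penetration)   # revenue tied to viewers no longer reached
        return new_service_revenue >= lost

    # Placeholder figures in $M per year:
    for pct in (0.6, 0.7, 0.8, 0.9):
        print(pct, mpeg4_switch_pays_off(pct, base_revenue=20.0, new_service_revenue=5.0))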

If the MPEG-4 option is ruled out because of the potential audience loss involved, what are the other options? Are these more or less expensive? More or less audience-friendly?

What certainly is no longer in doubt is that stations entering into CSAs have a total of 39 months to figure out how they are going to share an existing channel. If you have an opinion on this subject that you would like to share, please click the link below to leave a comment on this blog at our LinkedIn page.

Here at Telairity, we do it all when it comes to signal encoding, and we do it with capital costs in mind. Attractive prices, along with high-quality pictures, low bit rates, simple setup, and full software upgradeability have made Telairity a leader in the industry. Get in touch with us, or follow our activity on LinkedIn or Twitter, and see how we can work with you now and into the future.

Thursday, August 10, 2017

The Shrinking Broadcast Spectrum



Once upon a time, the radio frequency (RF) spectrum (ranging from 3 Hz to 3,000 GHz) was used exclusively for radio (wireless sound transmission)—when it was used at all. Indeed, from the 1860s (when James Clerk Maxwell first postulated the existence of radio waves) to the 1940s, there simply wasn’t any other practical use for radio waves. Then commercial television began elbowing its way into the RF spectrum, to transmit images together with sound. After TV came cell phones, which initially wanted RF spectrum for ordinary telephony, then for email, then for pictures, video, text messaging, and graphics. Today, a horde of untethered devices, ranging from L-Band satellites to RFID-tagged inventory items, all clamor for a share of the RF spectrum in an ever-expanding global communication network.

For broadcasters, the result has been a steadily shrinking spectrum. Frequencies assigned long ago, in a bygone era when the only wireless devices most people owned were their radios and TVs, have been reclaimed and reallocated for other uses. And since the demand for wireless communication in an increasingly mobile world continues to grow apace, every MHz of RF spectrum a broadcaster still holds has rapidly escalated in value. As a consequence, a broadcaster's choice to surrender or retain RF frequencies has become a decision to realize or forgo many millions of dollars.

Analog-to-Digital Shift and the 2 GHz Relocation


For broadcasters, the most momentous consequence of the growing demand for wireless spectrum was the mandate requiring that full-power stations complete the transition from long-familiar analog to new digital technology by mid-2009, with most low-power stations following by the end of 2011. In addition to its other benefits (including an end to signal problems like ghosting and snow), this shift to digital technology greatly increased the efficiency of RF usage. Better efficiency not only enabled a long-delayed shift from SD to HD resolutions but also allowed broadcasters to surrender spectrum while gaining functionality.

Thus, part of the overall digital transition was the exchange of seven 17- to 18-MHz-wide analog Broadcast Auxiliary Service (BAS) channels for seven narrower 12-MHz-wide digital BAS channels. Where the old analog BAS band had run from 1990 to 2110 MHz, the new digital band started instead at 2025.5 MHz. The resulting reallocation gave the lower 30% of the old analog band (1990 to 2025 MHz) to Sprint/Nextel, for incorporation into their adjacent Personal Communications Service (PCS) band (1850-1990 MHz). But this diminution in BAS bandwidth involved no sacrifice by broadcasters. On the contrary, they profited from it—twice over. First, in exchange for the extra bandwidth, Sprint/Nextel agreed to pay for all the equipment broadcasters needed to convert their existing BAS operations from analog to digital, e.g., new digital HD cameras and MPEG-4 encoders, transmitters, and receivers. Second, once equipped with new digital gear, broadcasters could do far more with a 12 MHz digital BAS channel than had ever been possible with an 18 MHz analog one (e.g., transmit two HD signals across it simultaneously).
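
The band arithmetic behind that swap is easy to check; the sketch below uses the approximate band edges cited above (treating the digital band as starting at roughly 2025 MHz):

    # Band arithmetic for the 2 GHz BAS relocation, using the approximate edges cited above.
    old_band = (1990, 2110)   # MHz, analog BAS: seven 17- to 18-MHz channels
    new_band = (2025, 2110)   # MHz, digital BAS: seven 12-MHz channels
    surrendered = new_band[0] - old_band[0]   # 35 MHz handed over to Sprint/Nextel

    old_width = old_band[1] - old_band[0]     # 120 MHz
    new_width = new_band[1] - new_band[0]     # 85 MHz (seven 12-MHz channels = 84 MHz)

    print(f"Old BAS band: {old_width} MHz")
    print(f"New BAS band: {new_width} MHz")
    print(f"Surrendered:  {surrendered} MHz (~{surrendered / old_width:.0%} of the old band)")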

Auction of UHF Spectrum

 

Since 1994, the FCC has sporadically auctioned off spectrum reclaimed either through shifts in transmission frequencies (like the 2 GHz relocation) or through consolidation of operations. Most recently, however, it conducted a first-ever “incentive auction” (authorized by Congress in 2012) of 84 MHz taken from the UHF TV broadcast spectrum. TV stations using these frequencies could voluntarily relinquish them in exchange for a share of the auction proceeds. The initial auction phase of this spectrum reallocation took one year, starting in March 2016 and ending in March 2017, and raised a total of $19.8B (setting the market price of 1 MHz of spectrum at just under $236M or, on a national MHz-pop scale, at about $0.73). The UHF broadcasters that volunteered to surrender spectrum are dividing the lion’s share of this revenue (just over $10B), with the government claiming most of the rest (over $7B) for “debt reduction.” Except for 14 MHz, all the freed UHF spectrum goes to phone companies, who plan to incorporate the additional bandwidth into new 5G networks.
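
The per-MHz figures quoted above fall out of simple division; in the quick check below, the U.S. population value is an assumption (roughly the 2016 figure) used only to reproduce the MHz-pop number:

    # Reproduce the per-MHz and per-MHz-pop prices quoted above.
    total_raised = 19.8e9     # dollars raised in the auction
    spectrum_mhz = 84         # MHz of UHF spectrum reallocated
    us_population = 321e6     # assumed U.S. population, approximately the 2016 figure

    price_per_mhz = total_raised / spectrum_mhz
    price_per_mhz_pop = total_raised / (spectrum_mhz * us_population)

    print(f"Price per MHz:     ${price_per_mhz / 1e6:.0f}M")   # ~$236M
    print(f"Price per MHz-pop: ${price_per_mhz_pop:.2f}")      # ~$0.73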

While $10B might seem generous, even extravagant, compensation for surrendering 84 million Hz out of a total spectrum of 3,000 billion Hz, hopes for this auction had been higher. The expectation was that the seemingly frenzied demand for spectrum would drive bidding far higher, past $60B. Far from being the maximum amount hoped for, then, $10B was the minimum figure at which broadcasters were willing to part with this much spectrum.

(To be continued in part 2 of this blog)

Here at Telairity, we do it all when it comes to signal encoding, and we do it with capital costs in mind. Attractive prices, along with high-quality pictures, low bit rates, simple setup, and full software upgradeability have made Telairity a leader in the industry. Get in touch with us, or follow our activity on LinkedIn or Twitter, and see how we can work with you now and into the future.

Wednesday, June 7, 2017

The Asian DTH Market Gears Up for Netflix Competition

The television broadcast industry has undergone several rounds of radical transformation over the past few decades. Free-to-air transmissions gave way to paid cable alternatives which, in turn, had to make room for Direct-To-Home (DTH) satellite services. CRTs vanished, replaced by larger flat-panel screens. The venerable standard-definition format was superseded by a new high-definition standard, which was itself superseded by a still newer ultra-high-definition standard. Traditional analog broadcast technology was everywhere replaced by computer-age digital technology. And then, of course, there was deregulation, the advent of the smartphone and tablet, and the rise of the Internet, which between them changed everything again.



The Over-the-Top Netflix Phenomenon:

One of the most remarkable successes of the new age of broadcasting is over-the-top (OTT) video-on-demand (VOD) transmission across the Internet. OTT transmission has spawned a whole new generation of “cord cutters,” who spurn traditional broadcast television, with its rigid schedules and fixed choices, in favor of free or low-cost subscription services offering a virtually unlimited range of content available anytime on any connected device, be it a smartphone, a tablet, a notebook, a computer monitor, or a big-screen smart TV. Two of the most popular new OTT providers are Netflix and Hulu. Although these and other OTT companies like Amazon and YouTube have significantly disrupted Western markets, Asian DTH service providers claim to remain unperturbed because, according to them, OTT cannot beat DTH in the foreseeable future.

Skinny Bundles & Better Transmission Quality: 

The main attractions of OTT services like Netflix and Hulu are personalization and binge watching. While these attributes can't be brought to standard broadcast services, whether delivered over the air, by cable, or by DTH, broadcasters are ditching their overwhelming and confusing 400-channel lineups in favor of skinnier linear bundles. These bundles give users less to choose from, which turns out to be a better strategy for promoting longer viewing times. Many Asian markets are moving to streamlined packages that cover news, preschool children's programming, premium factual content, and lifestyle programming.
Just as broadcasters are transmitting less and becoming more quality conscious, satellite fleet operators are also looking into ways to enhance the viewer experience, fully capitalizing on the proliferation of HD and, in some cases, 4K devices.

Telairity Helps the DTH Sector Excel: 

Telairity has always lived up to its promise of providing the best SD and HD encoders in the industry. It provides key DTH components like:
  • High-compression encoders, so HD transmissions won't overload broadcast capacity.
  • Professional decoders to decompress, descramble, and reformat signals.
  • Premium modulators that generate L-Band and IF-Band satellite signals fully compliant with the DVB-S/S2 and DSNG standards.
If you want to learn more about Telairity’s capabilities, please contact us.

Monday, May 15, 2017

Live Remote Broadcasts from Disaster Areas

In the event of a serious storm or unexpected disaster, the best way to get information is still from live television broadcasts.  While slowly developing weather events, like hurricanes and blizzards, can be forecast and tracked in advance, other events, like tornados and earthquakes, often occur suddenly, with little if any advance warning.  In both cases, it is important that the public see what is going on, where the damage is, and whether evacuation is needed.  Live remote broadcasts from these locations can supply this information to a broad audience in real time.

Reporters do not hesitate to go on scene to investigate breaking news where and when it is unfolding, in order to share that information with the general public.  Everyone has seen the TV shot of a reporter battling the wind or snow while doing a live report, but little attention is paid to the technology needed to make this broadcast happen.

At Telairity, we design and manufacture video processing solutions built to withstand the rigors of live field news reporting. Telairity has been an industry leader in this area for years with our BC8110 28V DC encoder for aircraft, and our BE8110, BE8500, and new Nexgen BE8600 encoders for ground vehicles.  With reliability features like dual power supplies, usability features like “instant on,” and encoding capabilities that include both a high-bitrate 4:2:2 10-bit mode for capturing the best possible images for archiving and editing and a low-bitrate 4:2:0 8-bit mode for backhaul over crowded urban airwaves or direct-to-viewer distribution, Telairity contribution encoders have been durable, responsive workhorses for all types of ENG (electronic news gathering) vehicles since their introduction in 2008.
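
The gap between those two encoding modes comes down to chroma sampling and bit depth. As a rough sketch (assuming a 1080-line picture at 30 frames per second, before any compression), the raw data rates compare like this:

    # Rough uncompressed data rates for the two contribution modes mentioned above.
    # Frame size and frame rate are assumptions for a 1080-line HD signal.
    WIDTH, HEIGHT, FPS = 1920, 1080, 30

    def raw_mbps(bits_per_sample, samples_per_pixel):
        """Uncompressed video rate in Mbps for the given sampling scheme."""
        return WIDTH * HEIGHT * samples_per_pixel * bits_per_sample * FPS / 1e6

    # 4:2:2 halves chroma resolution horizontally only -> 2 samples per pixel on average.
    # 4:2:0 halves chroma both horizontally and vertically -> 1.5 samples per pixel.
    hq_mode  = raw_mbps(10, 2.0)   # 4:2:2, 10-bit: archive/edit quality
    low_mode = raw_mbps(8, 1.5)    # 4:2:0, 8-bit: backhaul or direct-to-viewer

    print(f"4:2:2 10-bit raw: {hq_mode:.0f} Mbps")   # ~1244 Mbps before compression
    print(f"4:2:0  8-bit raw: {low_mode:.0f} Mbps")  # ~746 Mbps before compression
    print(f"Ratio: {hq_mode / low_mode:.2f}x")       # ~1.67x more data to encode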


For more information on the products Telairity offers that can meet the demands of live remote broadcasts from difficult situations and locations, please contact us.

Thursday, April 27, 2017

New Carrier ID Satellite Standards

Close to two decades ago, satellite transmission began to be plagued by the menace of interference. As the number of content providers increased, uplink equipment pumped out an endless stream of video and data for end-user (customer) consumption. This ever-increasing number and variety of satellite transmissions led to rising rates of interference, as competing signals jostled for space in increasingly crowded RF bands.

What could the satellite operators do to ensure quality transmissions and smooth uninterrupted viewing for clients? If they couldn’t identify the interfering signal and its location or source, the matter couldn’t be resolved.

The Satellite Interference Reduction Group (IRG) was formed in the late 90s with the sole purpose of solving this issue. Since it isn't feasible to completely eliminate RF signal interference, the breakthrough came in the form of carrier identification (CID). CID allows the source of a satellite signal to be identified, giving the receiving operator the ability to quickly contact the sender and clear up interference whenever it occurs.

Carrier Identification (CID):

CID is a special marker that is injected into satellite signals by uplink modulators. The additional CID data includes:

  • 64-bit MAC address
  • Vendor serial number 
  • Other user-configurable data, such as GPS coordinates, the carrier name, and user contact information
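
As a purely illustrative sketch (the field names and layout below are hypothetical, chosen for readability, and do not reproduce the ETSI Carrier ID wire format), the injected metadata can be pictured as a small record:

    # Illustrative (hypothetical) representation of the metadata carried by a CID --
    # not the ETSI Carrier ID wire format, just the fields listed above.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class CarrierID:
        mac_address: str                         # 64-bit MAC/EUI-64 of the uplink modulator
        vendor_serial: str                       # modulator vendor serial number
        gps_coordinates: Optional[str] = None    # user-configurable, e.g. "37.37N 122.03W"
        carrier_name: Optional[str] = None       # user-configurable carrier name
        contact: Optional[str] = None            # user-configurable phone or email contact

    # Example record a measurement receiver might recover from an interfering carrier:
    cid = CarrierID(
        mac_address="00-1B-63-FF-FE-84-45-E6",
        vendor_serial="TLR-0012345",
        carrier_name="Example Uplink 7",
        contact="+1 555 0100",
    )
    print(cid)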

Special measurement receivers installed at satellite operators' facilities can isolate CID-enhanced signals, trace them back to their uplink source, and identify the broadcaster. Resolving the interference then often requires nothing more than a simple phone call to sort out the frequency mismatch.


Issues with Carrier ID Insertion & WBU-ISOG’s New Resolution: 

For the CID standard to be effective in managing the growing problem of interference between competing satellite transmissions, all signals relayed via satellite must include the new Carrier ID data. To facilitate universal compliance with the new CID standard, the WBU-ISOG has issued resolutions that mandate:

  • Inclusion of CID functionality in the specifications given to manufacturers for new equipment
  • The presence of CID functionality in all new model modulators and codecs with integrated modulators
  • The phasing out of the existing CID NIT standard in favor of a more standardized ETSI carrier ID

By not leaving the use of CID to chance, broadcasters and satellite operators can finally put an end to interference – or, at least, to the kind of interference that can’t be resolved simply because the source of an interfering signal can’t be identified.

Telairity is doing its part in the struggle to control satellite interference by supplying state-of-the-art encoders, decoders, and modulators that comply with the latest CID specifications.

Thursday, October 6, 2016

The Mobile Revolution is Underway

Recent research has revealed something extraordinary in the broadcasting world. Over the past five years, the consumption of video on mobile phones and other handheld devices has risen by an enormous 2084%, with no signs of slowing down. The spike in the last quarter of 2015 alone was an impressive 35%, bringing the cumulative increase since 2013 to 170%.



Millennials, the first real ‘mobile generation’, are driving the popularity of live and recorded video streaming on devices that fit in their pockets. This segment is always on the move and prefers to stay in the know by catching up with global events and important trends not only through videos on YouTube and other websites, and through branded visual content created by companies as part of their marketing outreach, but also by watching live breaking news made available by local stations and national networks.

These tendencies give rise to a unique set of challenges for video marketers: 

  • The devices preferred by millennials vary widely. Some stick to the 5-inch screen on their mobile phone, while others enjoy the greater size and resolution of an iPad or other tablet. Videos should satisfy a spectrum of resolution requirements to serve different users and their handhelds of choice.
  • Internet and Wi-Fi connectivity varies greatly from place to place, and even from time to time in the same place. Metropolitan areas typically have much better bandwidth and download speeds than rural areas. A video that stutters and lags is likely to be counter-productive, destroying not only the entertainment value but any commercial value the video may have. This problem is especially acute for real-time video, which must either be encoded dynamically at the best quality a least-common-denominator format allows, or be encoded simultaneously in multiple formats for multiple client devices (see the sketch after this list).
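
One common way to handle that spread of devices and connection speeds is to encode each live feed into a small ladder of renditions and let the player pick the one that fits its measured bandwidth. The resolutions and bitrates below are illustrative assumptions, not a recommended profile:

    # Illustrative adaptive-bitrate (ABR) ladder for a live mobile feed.
    # Resolutions and bitrates are assumptions for illustration, not a recommended profile.
    ladder = [
        {"name": "mobile-low",  "resolution": (640, 360),   "bitrate_kbps": 600},
        {"name": "mobile-high", "resolution": (960, 540),   "bitrate_kbps": 1200},
        {"name": "tablet-hd",   "resolution": (1280, 720),  "bitrate_kbps": 2500},
        {"name": "full-hd",     "resolution": (1920, 1080), "bitrate_kbps": 5000},
    ]

    def pick_rendition(measured_kbps, renditions=ladder):
        """Choose the highest rendition whose bitrate fits the measured bandwidth."""
        fitting = [r for r in renditions if r["bitrate_kbps"] <= measured_kbps]
        return max(fitting, key=lambda r: r["bitrate_kbps"]) if fitting else renditions[0]

    print(pick_rendition(1800)["name"])   # "mobile-high" on a mediocre cellular link
    print(pick_rendition(8000)["name"])   # "full-hd" on good Wi-Fi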


Telairity Mobile Encoders are the Future: 


With its three encoding systems, namely the BE8600 contribution encoder, the SES3200 dense encoding system, and the multichannel BE8700 distribution encoder, Telairity provides powerful solutions for real-time mobile video.
  • The compact “go anywhere” single-channel BE8600 encoder is easy to deploy anywhere needed to capture live camera output for redistribution. With a choice of resolution standards and audio codecs, it can always provide the right combination of picture and sound for mobile devices. 
  • Video feeds passed through the cloud to the SES3200 can be rendered in up to 32 formats and resolutions at once, simultaneously servicing fixed and mobile screens of all types and sizes.
  • Less extensive formatting requirements can be satisfied by direct or cloud-based feeds to the BE8700 for simultaneous 4-way rendering, using Telairity’s optimized distribution encoding for the highest quality real-time images at low bitrates.

If you need reliable, real-time equipment able to service mobile in addition to standard broadcast needs, the streamlined Telairity video encoders tick all the boxes. Please contact us here for more information.

Wednesday, August 31, 2016

Telairity Dives Deep Into 4K Technology – Part 4

The value of UHD over HD is that it allows us to get closer to screens of the same size, or view bigger screens at the same distance, with no change in visual quality. In either case, the screen will appear bigger to us, i.e., occupy more of our total viewing area. And that, we said, means UHD enables a more immersive or higher quality viewing experience.

This improvement, however, is not free. Its cost is quadrupling the number of pixels per display, from about 2 million to about 8 million. What are the implications of multiplying pixels?

Digitally speaking, every pixel is a number, specifically a binary number that represents a specific color shade. For each pixel, the display reads its number and generates a block of the corresponding color, at the location appropriate for that pixel, in a size appropriate to the resolution format for a display of the given dimensions.

The pixel numbering standard in common use today for broadcast television is so-called “8-bit” color, which generates a binary number 24 bits long for each pixel, sufficient to enable a total palette of over 16 million colors.[1] Since 16 million is more color shades than even the most discerning human eye can distinguish, 8-bit color (24 bits/pixel) is sometimes called “true color”, as the first and simplest digital color scheme to enable everything the human eye can see (and more).[2]

The problem created by digital imagery in general, and HD and UHD television in particular, isn’t that digital technology is inferior to older analog technology, or that it is inadequate to express the full range of our senses. It is simply that digital technology able to provide a high quality experience takes a lot of bits, and improvements in quality take even more bits.

Specifically, an HD picture composed of 2 million pixels, each corresponding to a 24-bit number, requires 48 million bits to express. And that is just for a single frame. Full HD plays out at 30 frames a second, meaning a total bit rate of nearly 1.5 billion bits every second.

This is not just a large number; it is an overwhelming number. It is impractical to store 1.5 billion bits for every second of HD video captured, let alone transmit bits at that rate. Fortunately, there is a powerful remedy for the proliferation of bits required by digital rendering technology, namely digital compression technology. Compression technology is especially powerful for video, where standards like H.264 allow the elimination of 299 bits out of every 300, reducing 1.5 billion bits a second to a much more manageable 5 million bits a second.
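
The arithmetic above is easy to reproduce. The sketch below uses the exact 1920 x 1080 pixel count (the text rounds it to 2 million pixels) and the same rule-of-thumb 300:1 compression ratio:

    # Reproduce the bit-count arithmetic above for an HD frame and stream.
    BITS_PER_PIXEL = 24      # "8-bit" true color = 3 channels x 8 bits
    FPS = 30                 # full HD frame rate used in the text

    hd_pixels = 1920 * 1080                             # ~2 million pixels per HD frame
    hd_frame_bits = hd_pixels * BITS_PER_PIXEL          # ~50 million bits per frame
    hd_raw_bps = hd_frame_bits * FPS                    # ~1.5 billion bits per second

    compression_ratio = 300                             # rule-of-thumb H.264 ratio cited above
    hd_compressed_bps = hd_raw_bps / compression_ratio  # ~5 million bits per second

    print(f"HD frame:      {hd_frame_bits / 1e6:.1f} Mbits")
    print(f"HD raw:        {hd_raw_bps / 1e9:.2f} Gbps")
    print(f"HD compressed: {hd_compressed_bps / 1e6:.1f} Mbps")

    # UHD quadruples the pixel count, and with it every figure above:
    uhd_raw_bps = (3840 * 2160) * BITS_PER_PIXEL * FPS
    print(f"UHD raw:       {uhd_raw_bps / 1e9:.2f} Gbps")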

But what happens to data rates when the television industry shifts from HD to UHD? In the next part of this series, we will look at the dark underside of the move to UHD display technology.

Telairity has made a name for itself as the industry’s leading video processing solutions provider. Please write to us at sales@telairity.com to learn more about our products and to collaborate with our team.


[1] Why are pixels 24 bits long described as “8-bit color”? It’s because “8-bit color” refers not to pixel length, but rather to “channel” length, or the number of bits used to encode each of the 3 primary colors (Red-Green-Blue) that make up a pixel. Adding the 3 8-bit primary color “channels” together gives the overall total of 3 x 8 or 24 bits/pixel. There are 256 8-bit binary numbers (possible combinations of 1s and 0s between 00000000 and 11111111). Thus, an 8-bit channel provides 256 distinct shades each of Red, Green, and Blue, or 256 x 256 x 256 = 16,777,216 “mixed” colors.
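
The channel arithmetic in this footnote is quick to verify:

    # Quick check of the channel arithmetic above.
    shades_per_channel = 2 ** 8              # 256 values per 8-bit channel
    mixed_colors = shades_per_channel ** 3   # three channels: Red, Green, Blue
    print(shades_per_channel, mixed_colors)  # 256 16777216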

[2] Although the long strings of 1s and 0s that comprise binary numbers can seem quite daunting on first encounter, understanding binary numbering is really very easy. The basic rule is just that every bit added to a binary number doubles the number of possible combinations supported. This can be seen most readily by starting at the beginning, with 1 bit, which has only 2 possible values (0, 1). Adding a second bit allows 4 possible values (00, 01, 10, 11). And so on: 3 bits have 8 possible values (000, 001, 010, 011, 100, 101, 110, 111), 4 bits have 16 possible values, 5 bits 32 possible values, etc. By the time you reach the 8-bit values used in “true color” RGB encoding, this doubling algorithm has passed by 64 (6 bits) and 128 (7 bits) to reach 256 possible combinations. The doubling rule itself is most readily understood by the fact that adding a bit simply allows us to write all the numbers of the previous set twice over, the first time tacking a 0 on to the front of all the previous numbers, the second time tacking on a 1 (e.g., compare the 8 3-bit values with the 4 2-bit values shown above). 8 bits is regarded as “true color” since it is the first channel value safely past the outer limits of human color perception. That is to say, if you build up a color bar out of 256 strips, each with an adjacent shade of, for example, red—running from a strip of pure red on one end to a strip of pure black (no color) on the other—this color bar will not appear to the eye as 256 distinct stripes, but rather as a single continuous gradient, shading from red to black by insensible steps. Which is to say, when a color is divided into as many as 256 distinct steps, we have moved below the threshold of noticeable differences between adjacent steps—in other words, no one can tell shade 1 from shade 2, shade 2 from shade 3, and so on down the row of 256 shades. In fact, for most people, the same would be true of a color bar built up from 128 strips (7-bit channels), but the very keenest eyes under ideal conditions might be able to distinguish very faint stripes in this bar. So 7-bit color channels (128 x 128 x 128 = 2,097,152 mixed colors) are not quite past the limits of human perception. But 8-bit color channels, which multiply the number of mixed colors by 8 (= 2 x 2 x 2), are easily sufficient to include not only all the colors anyone might ever be able to distinguish under any circumstances, but many millions more besides that no one can tell apart from their neighbors.