Last night’s episode of “Game of Thrones” was a wild ride and inarguably one of an epic show’s more epic moments — if you could see it through the dark and the blotchy video. It turns out even one of the most expensive and meticulously produced shows in history can fall prey to the scourge of low-quality streaming and bad TV settings.

The good news is that this episode is going to look amazing on Blu-ray, or potentially in future, better streams and downloads. The bad news is that millions of people already had to see it in a way its creators surely lament. You deserve to know why this was the case. I’ll be simplifying a bit here because this topic is immensely complex, but here’s what you should know. (By the way, I can’t entirely avoid spoilers, but I’ll try to stay away from anything significant in words or images.)

It was clear from the opening shots of last night’s episode, “The Long Night,” that this was going to be a dark one. The army of the dead faces off against the allied living forces in the darkness, made darker by a bespoke storm brought in by, shall we say, a Mr. N.K., to further demoralize the good guys.

If you squint you can just make out the largest army ever assembled

Thematically and cinematographically, setting this chaotic, sprawling battle at night is a powerful creative choice and a valid one, and I don’t question the showrunners, director, and so on for it. But technically speaking, setting this battle at night, and in fog, is just about the absolute worst-case scenario for the medium this show is native to: streaming home video. Here’s why.

Compression factor

Video has to be compressed in order to be sent efficiently over the internet, and although we’ve made enormous strides in video compression and in the bandwidth available to most homes, there are still fundamental limits.
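To put rough numbers on those limits, here’s a quick back-of-the-envelope sketch. The 1 TB master size and 80-minute running time below are illustrative assumptions, not HBO’s actual figures:

```python
# Back-of-the-envelope: why an uncompressed master can't be streamed.
# The master size (~1 TB) and running time (~80 min) are illustrative
# assumptions, not official figures.

def required_rate_mb_s(total_bytes: float, duration_s: float) -> float:
    """Sustained download rate (megabytes/sec) needed to stream in real time."""
    return total_bytes / duration_s / 1e6

MASTER_BYTES = 1e12        # a ~1 TB master
EPISODE_SECONDS = 80 * 60  # an ~80-minute episode

raw_rate = required_rate_mb_s(MASTER_BYTES, EPISODE_SECONDS)
print(f"raw master needs ~{raw_rate:.0f} MB/s sustained")  # ~208 MB/s

# The 25 Mbit/s U.S. broadband benchmark, in megabytes (divide bits by 8):
broadband_mb_s = 25 / 8
print(f"broadband benchmark: {broadband_mb_s:.3f} MB/s")
print(f"so the codec must shrink the stream roughly {raw_rate / broadband_mb_s:.0f}x")
```

Even with generous rounding, the compression has to squeeze the stream by well over an order of magnitude before an ordinary connection can keep up.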
The master video that HBO put together from the actual footage, FX, and color work that goes into making a piece of modern media would be huge: hundreds of gigabytes, if not terabytes. That’s because the master has to include all the information on every pixel in every frame, no exceptions.

Imagine if you tried to “stream” a terabyte-sized TV episode. You’d have to be able to download upwards of 200 megabytes per second for the full 80 minutes of this one. Few people in the world have that kind of connection — it would basically never stop buffering. Even 20 megabytes per second is asking too much by a long shot. Two megabytes per second is doable — slightly under the 25-megabit speed (that’s bits… divide by 8 to get bytes) we use to define broadband download speeds.

So how do you turn a large file into a small one? Compression — we’ve been doing it for a long time, and video, though different from other types of data in some ways, is still just a bunch of zeroes and ones. In fact it’s especially susceptible to strong compression because one video frame is usually very similar to both the last one and the next one. There are all kinds of shortcuts you can take that reduce the file size immensely without noticeably impacting the quality of the video. These compression and decompression techniques fit into a system called a “codec.”

But there are exceptions to that, and one of them has to do with how compression handles color and brightness. Basically, when the image is very dark, it can’t display color very well.

The color of winter

Think about it like this: there are only so many ways to describe colors in a few words. If you have one word, you can say red, or maybe ochre or vermilion, depending on your interlocutor’s vocabulary. But if you have two words you can say dark red, darker red, reddish black, and so on. The codec has a limited vocabulary as well, though its “words” are the number of bits it can use to describe a pixel.
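That limited vocabulary can be made concrete. A minimal sketch, where the “dark scene occupies a tenth of the brightness range” figure is an illustrative assumption rather than a measurement of the episode:

```python
# Bit depth as vocabulary: an n-bit channel can describe 2**n shades.
# The "dark scene occupies a tenth of the range" figure is an illustrative
# assumption, not a measurement of the episode.

def shades(bits: int) -> int:
    """Distinct values an n-bit channel can take."""
    return 2 ** bits

for bits in (8, 10, 12):
    print(f"{bits}-bit channel: {shades(bits)} shades")

# Standard 8-bit video gets 256 shades for the WHOLE brightness range.
# A night scene shrouded in fog might sit in the bottom tenth of that
# range, leaving only a couple dozen shades for every subtlety on screen.
dark_fraction = 0.1
dark_shades = int(shades(8) * dark_fraction)
print(f"shades available to a very dark scene: ~{dark_shades}")
```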
This lets it succinctly describe a huge array of colors with very little data by saying: this pixel has this bit value of color, this much brightness, and so on. (I didn’t originally want to get into this, but this is what people are talking about when they say bit depth, or even “highest-quality pixels.”) But this also means that there are only so many gradations of color and brightness it can show. Going from a very dark grey to a slightly lighter grey, it might be able to pick five intermediate shades. That’s perfectly fine if it’s just on the hem of a dress in the corner of the image. But what if the whole image is limited to that small selection of shades?

Then you get what we saw last night. See how Jon (I think) is made up almost entirely of a handful of different colors (brightnesses of a similar color, really), with big, obvious borders between them? This issue is called “banding,” and it’s hard not to notice once you see how it works. Images on video can be incredibly detailed, but places where there are subtle changes in color — often a clear sky or some other large but mild gradient — will exhibit large stripes as the codec goes from “darkest dark blue” to “darker dark blue” to “dark blue,” with no “darker darker dark blue” in between.

Check out this image. Above is a smooth gradient encoded with high color depth. Below it is the same gradient encoded with lossy JPEG compression — different from what HBO used, obviously, but you get the idea.

Banding has plagued streaming video forever, and it’s hard to avoid even in major productions — it’s just a side effect of representing color digitally. It’s especially distracting because our eyes obviously don’t have that limitation. A high-definition screen may actually show more detail than your eyes can discern from couch distance, but color issues? Our visual systems flag them like crazy.
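The banding effect is easy to reproduce. Here’s a minimal NumPy sketch that snaps a smooth dark gradient to a limited palette; the level counts are arbitrary, and real codecs are far more sophisticated than this simple rounding step:

```python
import numpy as np

# Banding in miniature: quantize a smooth dark gradient to the nearest of a
# fixed set of evenly spaced shades, a crude stand-in for limited bit depth.

def quantize(signal: np.ndarray, levels: int) -> np.ndarray:
    """Snap each value in [0, 1] to the nearest of `levels` allowed shades."""
    step = 1.0 / (levels - 1)
    return np.round(signal / step) * step

# A 1920-pixel-wide gradient covering only the darkest 10% of the
# brightness range, like a foggy night sky.
dark_gradient = np.linspace(0.0, 0.1, 1920)

banded = quantize(dark_gradient, levels=32)
print("distinct shades in the smooth input:", len(np.unique(dark_gradient)))
print("distinct shades after quantizing:  ", len(np.unique(banded)))
```

A palette of 32 shades sounds like plenty, but because the whole scene lives in the darkest sliver of the range, only a handful of those shades ever get used — each one becomes a visible stripe.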
You can minimize it, but it’s always going to be there, until the point when we have as many shades of grey as we have pixels on the screen.

So back to last night’s episode. Practically the entire show took place at night, which removes about three-quarters of the codec’s brightness-color combos right there. It also wasn’t a particularly colorful episode — a directorial or photographic choice that highlighted things like flames and blood, but further limited the ability to digitally represent what was on screen. It wouldn’t have been too bad if the background were black and the people lit well enough to pop out, though. The last straw was the introduction of the cloud, fog, blizzard, whatever you want to call it. This kept the brightness of the background just high enough that the codec had to represent it with one of its handful of dark greys, and the subtle movements of fog and smoke came out as blotchy messes (often called “compression artifacts”) as the compression desperately tried to pick the best shade for each group of pixels.

Just brightening it doesn’t fix things, either — because the detail is already crushed into a narrow range of values, you just get a banded image that never gets completely black, making it look washed out, as you see here:

(Anyway, the darkness is a stylistic choice. You may not agree with it, but that’s how it’s supposed to look, and messing with it beyond making the darkest details visible could be counterproductive.)

Now, it should be said that compression doesn’t have to be this bad. For one thing, the more data the codec is allowed to use, the more gradations it can describe, and the less severe the banding. It’s also possible (though I’m not sure where it’s actually done) to repurpose the rest of the codec’s “vocabulary” to describe a scene where its other color options are limited. That way the full bandwidth can be used to describe a nearly monochromatic scene, even though strictly speaking it should only be using a fraction of it.
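On the earlier “just brightening it” point, a toy sketch shows why detail crushed at encode time can’t be recovered by the viewer. The quantization function is the same crude stand-in for limited bit depth as before, and all the numbers are arbitrary:

```python
import numpy as np

# Why turning up TV brightness can't fix crushed shadows: the quantization
# happened at encode time, so adjusting afterward spreads the bands apart
# without creating any new shades. All numbers here are arbitrary.

def quantize(signal: np.ndarray, levels: int) -> np.ndarray:
    """Snap each value in [0, 1] to the nearest of `levels` allowed shades."""
    step = 1.0 / (levels - 1)
    return np.round(signal / step) * step

night_scene = np.linspace(0.0, 0.08, 1000)  # subtle near-black gradient
encoded = quantize(night_scene, levels=64)  # what survives "compression"

# Viewer cranks brightness: scale the shades up and lift the black level.
brightened = np.clip(encoded * 3.0 + 0.05, 0.0, 1.0)

print("shades before brightening:", len(np.unique(encoded)))
print("shades after brightening: ", len(np.unique(brightened)))
print("darkest value after lift: ", brightened.min())  # no longer true black
```

The shade count is identical before and after: the picture gets lighter and more washed out, but every band survives.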
But neither of those fixes (more bandwidth or a repurposed vocabulary) is likely an option for HBO. Increasing the bandwidth of the stream is costly, since this is being sent out to tens of millions of people — a bitrate increase big enough to change the quality would also massively swell their data costs. When you’re distributing to that many people, it also introduces the risk of hated buffering or errors in playback, which are obviously a big no-no. It’s even possible that HBO lowered the bitrate because of network limitations — “Game of Thrones” really is on the frontier of digital distribution. And using an exotic codec might not be possible because only commonly used commercial ones can really be applied at scale — kind of like how we try to use standard parts for cars and computers.

This episode almost certainly looked fantastic in the mastering room and FX studios, where they not only had carefully calibrated monitors with which to view it but also were working with brighter footage (it would be darkened to taste by the colorist) and less or no compression. They might not even have seen the “final” version that fans “enjoyed.” We’ll see the better copy eventually, but in the meantime the choice of darkness, fog, and furious action meant the episode was going to be a muddy, glitchy mess on home TVs. And while we’re on the topic…

You mean it’s not my TV?

Well… to be honest, it might be that too. What I can tell you is that simply having a “better” TV by the specs — 4K, a higher refresh rate, whatever — would make almost no difference in this case. Even built-in de-noising and de-banding algorithms would be hard pressed to make sense of “The Long Night.” And one of the best new display technologies, OLED, might even make it look worse! Its “true blacks” are much darker than an LCD’s backlit blacks, so the jump to the darkest grey could be far more jarring. That said, it’s certainly possible that your TV is also set up poorly.
Those of us sensitive to this kind of thing spend forever fiddling with settings and getting everything just right for exactly this kind of situation.

There are dozens of us! Now who’s “wasting his time” calibrating his TV? — John Siracusa (@siracusa)

Usually “calibration” is actually a pretty simple process of making sure your TV isn’t on the absolute worst settings, which unfortunately many are out of the box. Here’s a very basic three-point guide to “calibrating” your TV:

1. Go through the “picture” or “video” menu and turn off anything with a special name, like “TrueMotion,” “Dynamic Motion,” “Cinema Mode,” or anything like that. Most of these make things look worse, especially anything that “smooths” motion. Turn those off first and never, ever turn them on again. Don’t mess with brightness, gamma, color space, or anything you have to turn up or down from 50 or whatever.

2. Figure out lighting by putting on a good, well-shot movie in the situation in which you usually watch stuff — at night maybe, with the hall light on or whatever. While the movie is playing, click through any color presets your TV has. These are often things like “natural,” “game,” “cinema,” or “calibrated,” and they take effect right away. Some may make the image look too green, or too dark, or whatever. Play around with them, and use whichever makes the picture look best. You can always switch later – I myself switch between a lighter and a darker scheme depending on the time of day and the content.

3. Don’t worry about HDR, dynamic lighting, and all that stuff for now. There’s a lot of hype about these technologies, and they are still in their infancy. Few will work out of the box, and the gains may or may not be worth it. The truth is that a well-shot movie from the ’60s or ’70s can look just as good today as a “high dynamic range” show shot on the latest 8K digital cinema rig.

Just focus on making sure the image isn’t being actively interfered with by your TV and you’ll be fine.
Unfortunately, none of these things will make “The Long Night” look any better until HBO releases a new version of it. Those ugly bands and artifacts are baked right in. But if you have to blame anyone, blame the streaming infrastructure that wasn’t prepared for a show taking risks in its presentation — risks I would characterize as bold and well executed, unlike the writing in the show lately. Oops, sorry, couldn’t help myself.

If you really wanted to experience this show the way it was intended, the fanciest TV in the world wouldn’t have helped last night, though when the Blu-ray comes out you’ll be in for a treat. But here’s hoping the next big battle takes place in broad daylight.
University of Washington graduate students Katherine McAlpine and Daniel Gochnauer work in the Ultracold Atoms Group’s lab to study ultracold atoms and quantum gases. (UW Photo / Dennis Wise)

Editor’s note: Tom Alberg is a co-founder and managing director at Seattle-based venture capital firm Madrona Venture Group. He is a member of Challenge Seattle and sits on the Amazon board of directors.

Commentary: This week I had the opportunity to speak at the Northwest Quantum Nexus Summit, co-sponsored by Microsoft, the University of Washington and Pacific Northwest National Laboratory. The Summit brought together, for the first time, the large network of quantum researchers, universities and technology companies working in quantum information science (QIS) in our region to share quantum developments and to work together to establish the Pacific Northwest as one of the leading quantum science centers in the world.

Quantum computing has the potential to transform our economies and lives. As one of the Summit speakers said, we are on the “cusp of a quantum century.” Quantum computers will be able to solve problems that classical computers can’t solve even if they run their algorithms for thousands of years. Quantum computers are not limited to the on-or-off (one-or-zero) bits of today’s digital computers; they manipulate “qubits” that can be one and zero simultaneously, which allows exponentially faster calculations for certain problems. Quantum computers are expected to be able to crack present-day security codes, which is already prompting scientists to devise new encryption protocols to protect consumer and business data and national security. Applications developed for quantum computers will likely help us overcome existing challenges in the material, chemical and environmental sciences, such as devising new ways to sequester carbon and improve batteries.
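The “one and zero simultaneously” idea can be sketched with plain linear algebra. This toy NumPy model of a single qubit is only illustrative; it is not how Q# or real quantum hardware represents state:

```python
import numpy as np

# A toy model of one qubit as a 2-component state vector. Illustrative only;
# real quantum hardware (and Microsoft's Q#) are far more involved.

zero = np.array([1.0, 0.0])  # the classical-like state |0>

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
hadamard = np.array([[1.0, 1.0],
                     [1.0, -1.0]]) / np.sqrt(2)

superposed = hadamard @ zero

# Measurement probabilities are the squared amplitudes: 50% zero, 50% one.
probabilities = superposed ** 2
print("P(0), P(1) =", probabilities[0], probabilities[1])
```

Until it is measured, the state genuinely carries both amplitudes at once, which is what quantum algorithms exploit.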
Even though the Seattle area is one of the top two technology centers in the U.S., along with the San Francisco Bay Area, we have to make investments now to ensure we become a leading quantum center. To achieve this goal, I argued that we will need to substantially increase financial support to build up the UW’s quantum research capacity and, equally important, to create an extensive quantum information science curriculum. The UW’s Paul G. Allen School of Computer Science and Engineering began this year to offer a course teaching Microsoft’s Q# language, but one course is not enough if we are to make our area one of the major quantum centers of the future.

Madrona Venture Group Managing Director Tom Alberg speaks at the Northwest Quantum Nexus Summit this week in Seattle. U.S. Rep. Derek Kilmer, D-Wash., is seated behind Alberg. (Pacific Northwest National Laboratory Photo / Andrea Starr)

Fortunately for our region, Microsoft is one of the acknowledged leaders in quantum computing and is committed to building our regional network. Microsoft CEO Satya Nadella credits former Microsoft chief technology officer and research leader Craig Mundie with launching Microsoft’s quantum initiative 10 years ago. Microsoft’s goal is no less than to build a “general-purpose” quantum computer — the holy grail of quantum computing. In the meantime, the company is supporting efforts to build a cadre of researchers who are familiar with quantum computing and capable of writing quantum programs. It has developed and launched a quantum programming language, Q#, as well as a quantum development kit and “Katas,” which are computing exercises that classical computer scientists can use to learn quantum computing skills. It is also building an open-source library of quantum programs and has launched the Microsoft Quantum Network to provide assistance to quantum startups and developers.
The federal government has recently launched the National Quantum Initiative, which will provide $1.2 billion over the next five years, primarily to quantum researchers. The president signed the new law in December after the bill was approved by unanimous consent in the Senate and a 348-11 vote in the House. Among its purposes is to build a “quantum-smart workforce of the future and engage with government, academic and private-sector leaders to advance QIS.”

This federal funding is welcome, even though it’s less than what would be required for a Manhattan Project-style effort equivalent to China’s national quantum initiative. It will be highly important to our region that our congressional delegation, several members of which are particularly tech-savvy, advocate our case for a fair share of this funding. Our Washington State Legislature should support this by making appropriations for quantum computing and education at the UW as a down payment showing local support.

There is also a role for private companies to support our quantum efforts beyond what Microsoft is already doing. I am reminded of the grants by Amazon to the UW in 2012, in the aftermath of the Great Recession, engineered by then-UW computer science chair Ed Lazowska to recruit two leading professors, Carlos Guestrin from Carnegie Mellon and Emily Fox from the University of Pennsylvania, to strengthen the UW’s machine learning expertise. The two $1 million gifts created two endowed professorships. Inflation has certainly raised the price of endowed professorships, but perhaps this could be repeated.

Microsoft is focusing on the development of quantum computers that take advantage of cryogenically cooled nanowires.
(Microsoft Photo)

Another way to build our region’s quantum expertise would be for a local tech entrepreneur to follow the example of Paul Allen, who endowed five $100 million-plus scientific institutes, one of which is the Allen Institute for Artificial Intelligence, headed by former UW professor and current Madrona venture partner Oren Etzioni.

Building a quantum workforce begins in K-12 schools with teaching computer science, which is a stepping stone to quantum information science. K-12 schools in the U.S. are woefully deficient in teaching basic computer science. Nationally, only 35 percent of high schools offer a computer science course, according to Code.org, and in low-income and minority schools the share is even lower, since that 35 percent includes many suburban schools, which are more likely to offer computer science courses. We are beginning to address this gap in high schools, but a much larger commitment is needed.

Private companies can help fill part of the gap. Amazon recently launched its Future Engineers program, which includes a $50 million investment in computer science and STEM education for underprivileged students. As part of this program, a few weeks ago Amazon announced grants to more than 1,000 schools in all 50 states, over 700 of which are Title 1 schools. Studies have shown that if a disadvantaged student takes an advanced computer science course in high school, they are eight times as likely to major in computer science at a university.

In addition to Amazon, Microsoft and other tech companies have programs to increase the teaching of computer science. One of those programs, backed by Microsoft, is TEALS, which organizes employees and retired employees as volunteers to teach computer science in schools. Amazon, Microsoft and other tech companies are also big financial supporters of Code.org, which is having a significant effect on increasing the teaching of computer science in public schools.
The Bureau of Labor Statistics projects that by 2020 there will be 1.4 million computer science-related jobs needing to be filled, but only 400,000 computer science graduates with the skills to apply for those jobs. Only a tiny percentage of those 400,000 are minorities or from low-income families. A similar need exists in Washington state, with a gap of several thousand between the jobs to be filled and the number of annual graduates.

In Seattle and other U.S. tech centers, we have been fortunate to be able to attract and retain a very substantial number of computer scientists from other countries to fill these jobs. But with immigration and trade uncertainties, this flow may not remain as robust as needed. Even more important, by not providing the opportunity for our kids, particularly disadvantaged children, we are short-changing them. The best way to close the income gap is to improve our public educational system so that a broader segment of our population can qualify for the jobs of the future. Organizations such as the Technology Access Foundation are attacking this problem head-on by creating curriculum, recruiting minority teachers and building schools. We need to support these organizations and implement their approach broadly.

At the university level, we are also deficient in educating a sufficient number of computer scientists. Even universities such as the UW, with large and high-quality computer science schools, are unable to fill the demand for computer scientists. The Allen School graduates about 450 undergraduate students annually. Although this is double what the school produced a few years ago, it is woefully short of the several thousand needed annually in our state. This number needs to double again, but funding is lacking.

In short, our region needs to recommit to building our computer science workforce, beginning in our K-12 schools, and undertake a new effort to build our quantum expertise and workforce.
White House science adviser Kelvin Droegemeier addresses the annual meeting of the American Association for the Advancement of Science in Washington, D.C., with a video image of him looming in the background. (GeekWire Photo / Alan Boyle)

WASHINGTON, D.C. — President Donald Trump’s newly minted science adviser reached out to his peers today at one of the country’s biggest scientific meetings and called for the establishment of a “second bold era” of basic research.

“I hope that you never forget that I am one of you,” Kelvin Droegemeier told hundreds of attendees here at the annual meeting of the American Association for the Advancement of Science. The University of Oklahoma meteorologist is coming into a job that was vacant for two years, in an administration that hasn’t exactly been viewed as science-friendly. The White House’s environmental policies are a particular sore point, in light of the president’s track record on climate issues. But Droegemeier’s selection has gotten generally good reviews from the science community.

AAAS CEO Rush Holt, a Ph.D. physicist and former congressman, took note of Droegemeier’s reputation as a “solid scientist” in his introduction. “Everyone who works with him finds him to have a very accessible manner,” Holt said. “We scientists hope and trust that this will turn into accessible policy.”

In his talk, Droegemeier invoked the legacy of science adviser Vannevar Bush, who set the stage for America’s postwar science boom in 1945 with a report he wrote for President Franklin D. Roosevelt, titled “Science, the Endless Frontier.” Droegemeier said modern-day America remains the world’s leader in science and technology, but warned that other countries were “nipping at our heels.”

“In many respects, we’re kind of thinking in the same ways that we have since World War II … and I would call that period from the Bush treatise in World War II up to the present that first great bold era of science and technology in that endless frontier,” Droegemeier said.
“The past 75 years have been extraordinary, and I think we’re about to turn a page into a new frontier.”

He said the second bold era would take advantage of the full sweep of America’s research assets, underpinned by American values and based on three pillars:

Understanding America’s research and development ecosystem in a new context: Droegemeier called for a quadrennial assessment that takes stock of the entire R&D enterprise, including research conducted by the government, the private sector, academia and nonprofit organizations. He pointed to the example of artificial intelligence research: What’s the future demand for AI, and what assets can be deployed to supercharge progress in that field? “The answer is that we don’t really have a clue,” he said. “Getting a handle on this as a portfolio is a real challenge, but in my view, if we’re able to do that, it will really help us think about how to strategically invest and move forward.”

Leveraging the collective strength of R&D sectors through innovative partnerships: Droegemeier talked about rekindling the spirit of “those famous blue-sky research labs of the past.” He suggested creating a network of “Alpha Institutes” to pursue “absolutely transformational ideas on some of the biggest challenges that face humanity today, like space exploration, climate change, eradicating disease and making it possible for people to live longer and healthier lives.” These institutes would be located at colleges and universities, and would be funded primarily by industries and nonprofits.

Ensuring that America’s research environments are safe, secure and welcoming: Droegemeier said he would work with the scientific community to tackle that issue.
He said another of his top priorities would be to make sure that “our resources do not fall into the hands of those attempting to do us harm, or those who would seek to reap the benefits of our hard work without doing hard work themselves.” And he called for “reducing the unnecessary administrative burdens that divert researchers’ time and attention away from innovating and discovery.” He estimated that such burdens cost a few billion dollars a year.

After the talk, Droegemeier got a tentative vote of support from Harvard physicist John Holdren, who served as President Barack Obama’s science adviser. “I think Kelvin’s going to do a great job,” Holdren told GeekWire. He added that Droegemeier is likely to face extra challenges because he’s joining the White House team halfway through Trump’s term of office. Holdren hoped that the White House would follow up by making long-overdue appointments to key science advisory posts.

AAAS’ Holt said Droegemeier’s speech was “a good talk,” but he held off on discussing specific suggestions, such as the Alpha Institute concept. “At the moment, it’s just talk,” Holt told GeekWire.