Category Archives: Technology

Thursday Review articles and reviews covering technology, energy, computers, and devices.

Seattle’s Dig: Bertha is Still Stuck, for Now


Seattle’s massive tunnel project, featuring as its star the boring machine Bertha, is at a standstill while engineers complete repairs. With the project two years behind schedule, is it possible for this mega project to be completed within its budget?  Read the full Thursday Review article here.

Earth’s Biggest Risk: Solar Storms

Image courtesy of NASA

Thursday Review examines the risk posed by solar flares, solar storms, and coronal mass ejections–phenomena now believed to threaten Earth’s power grids, communications, and electronics. Should we be prepared? For more, follow the link to the article on our Features Page and Front Page.

The Really BIG Apple!


Image courtesy of Apple

Thursday Review looks at how Apple posted the biggest quarterly profit of any company in the history of bookkeeping (in any country), and how the California corporation now holds more cash than almost any entity in the world. But can Apple keep it up? Or will it reach a plateau? Read the article: The Really BIG Apple; Thursday Review staff; January 30, 2015.

California High-Speed Rail Project Under Way



Thursday Review’s Alan Clanton takes a careful look at the contentious, controversial high-speed rail project now under construction in California. When completed in 2033, it will be the most expensive public works project in U.S. history, but–its proponents say–it will also be a game-changer for transportation and high-speed travel. Read more: California High-Speed Rail Project Under Way. Or find it on the Thursday Review Front Page.

Hacker Versus Hacker


Thursday Review examines the new and potentially complex question of business cyber wars: when is it appropriate for U.S. companies to engage in cyber retaliation against criminal hackers and thieves? And in the context of recent cyber attacks against JPMorgan Chase and Sony Pictures, what options are available to law enforcement in the U.S. and other countries? See more at

High Speed Rail: Hurry Up & Wait


Thursday Review’s R. Alan Clanton looks at how three states (California, Florida, Texas) are bringing the long-deferred dream of high-speed rail to the brink of reality. Will these expensive mega projects help Americans commute from city to city as fast as regional air travel? Second in a series of articles:

Orion is Older Than You Think

Photo: NASA/Bill Ingalls

Thursday Review looks at the long history of the Orion missions; these deep space voyage plans date back decades, to a time even before Gemini or Apollo. Are long-distance manned space voyages on the horizon? See more:

Movie Heist: Did North Korea Hack Sony Pictures?

Image courtesy of Sony Pictures

By R. Alan Clanton, Thursday Review editor

The production, marketing and release of major motion pictures are monumentally costly undertakings. Many millions are spent—sometimes $25, $30, $40 million or more—just on filming, editing and talent. The studios have an expectation that these movies, once projected onto big screens across the world, will rake in at least a modest profit.

In fact, the business model has evolved so completely over the last two decades that few films reach the shooting stage without first being subjected to a long, grueling process of approval by the powers-that-be. In the corporate model which now dominates the movie industry, few films reach the theaters without first being carefully measured for their capacity to make money for the studio, the parent company, and the stockholders.

So when word of the massive theft at Sony Pictures—a hack job which resulted in a dozen movies being digitally offloaded in their entirety—hit the streets of Hollywood and New York, it sent a shudder down the spine of anyone who has ever worked in the film business. The data breach at Sony Pictures resulted in, among other things, a premature online release of the new movie Annie, which was not scheduled for theatrical release until close to Christmas. Now, by conservative estimates, the movie has already been downloaded half a million times since the security breach was discovered less than one week ago. In fact, Annie is being downloaded at a rate of 500 units per minute worldwide even as you read this article. By tomorrow, industry analysts suggest, Annie will be available—for free—to more than 2 million viewers.
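The arithmetic behind those estimates is easy to check. A quick sketch using only the article's own figures (half a million downloads so far, roughly 500 per minute): at a steady rate, the total reaches only about 1.2 million after another full day, so the 2 million projection presumably assumes the pace keeps climbing.

```python
# Back-of-the-envelope projection using the article's estimates:
# ~500,000 downloads so far, at ~500 downloads per minute worldwide.
downloads_so_far = 500_000
rate_per_minute = 500

per_day = rate_per_minute * 60 * 24          # new downloads per day
total_after_one_day = downloads_so_far + per_day

print(per_day)              # 720,000 new downloads per day
print(total_after_one_day)  # about 1.22 million after one more day
```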

Sony has enlisted the FBI, as well as the services of several expensive private security teams to analyze the breach and halt the digital hemorrhage. But for Annie, the damage may already be financially catastrophic.

But Annie was not exactly the true target, at least according to some theories. In one of those strange cases of life imitating art—or vice-versa (sometimes it’s hard to tell which comes first)—and politics imitating comedy (think of Saturday Night Live’s uncanny parody of the failed health care website rollout)—North Korean hackers may, and we stress may, have been directly responsible for the security breach at Sony. The reason? Sony was weeks away from releasing a comedic take on the political thriller, The Interview, in which two American guys—posing as amateur web journalists—are sent by the CIA across the DMZ into North Korea with the task of assassinating the North Korean leader, Kim Jong-un. The movie stars Seth Rogen and James Franco.

Sony Pictures has multiple security clean-up crews working to contain the damage caused by the breach. In addition to the theft of digital copies of entire motion pictures—at least five of which have already been downloaded millions of times within the last few days—the hackers also crashed most of Sony’s computer system, disrupting databases and making email delivery and receipt impossible. Sony has hired security contractor FireEye’s Mandiant team to repair the damage and get all systems back online, but it may take a few more days before all loopholes are closed and all network operations are back to normal. Sources inside Sony have revealed to some in the media that the security breach, in terms of cost and scale, may be bigger than last year’s Target hack, or this year’s massive Home Depot data breach.

Though law enforcement has not made any comment publicly on where it is looking, dozens of sources—both those with knowledge of the FBI and those with direct connections inside Sony—have indicated that the cyber-attack may have been retribution by North Korean techies in the service of Kim Jong-un, who has called the farcical movie “an act of war” and an “aggressive form of cultural attack.”

When the computer system crashed last week, employees at Sony say that most screens displayed a dark red skull with the words “hacked by #GOP.” And no, that’s not the Grand Old Party we generally think of as Republicans, but a group calling itself the “Guardians of Peace.” As thousands of people in the business world and the Hollywood movie industry have noted, emails sent to Sony Pictures employees are immediately bounced back. In the meantime, all Sony business is being conducted old school: by phone, by fax, or by Xerox machine.

Can North Korea claim victory in this attack? Neither the FBI nor other law enforcement agencies are commenting—at least in specific ways—but there have been plenty of indications that U.S. agencies are looking directly at North Korea as the perpetrator of the attack. According to several major news sources, law enforcement officials speaking off-the-record say that the Sony cyber-heist has Pyongyang’s thumbprints all over it. One can only assume that the damage is real and measurable, especially when calibrated by the revenue apparently lost because of films prematurely released online. Besides the new Annie, the other films apparently stolen in the breach include Mr. Turner and Fury. Fury, a war movie directed by David Ayer and starring Brad Pitt and Shia LaBeouf, opened in theaters last month, but illegal downloads of it also reached the thousands per hour as of this past weekend. According to the film website IMDb, Fury has already grossed about $82 million. But the illegal downloads may quickly suppress future profits.

Sony Pictures’ data breach would be the largest such single cyber-attack to hit a major motion picture studio.

North Korea has made no official comment on the brouhaha. But many in both the foreign policy arena and the movie business recall that the isolated country—which sits north of the demilitarized zone established by the United Nations at the end of military hostilities more than 60 years ago—was not amused by the thought of an American-made movie about the assassination of its dear leader, parody or otherwise. In June 2014, a spokesperson for the North Korean government declared that all North Koreans were being challenged to “mercilessly destroy anyone who dares hurt or harm the supreme leadership of the country…even one bit.” Serious words. Except that those who follow the daily narrative from Pyongyang know that such harsh language is par for the course, as it were.

Like its larger neighbor to the north and west, China, and like rogue-state Iran, North Korea has established a specialized military unit whose sole purpose is cyber-warfare. This brigade of roughly 1,500 techies—dressed in army uniforms, unlike their blue-jeaned, black-t-shirted counterparts in California—is tasked with engaging in digital battle, and it is empowered to ignore web etiquette and international law. In North Korea, this cyber-warrior battalion is called Unit 121. Some U.S. security experts have suggested that the Sony Pictures data breach has all the markings of an attack by the loyal shock troops of Unit 121.

Ironically, North Korea has one of the tiniest internet footprints in the world. By some estimates, less than one percent of its population has any internet access at all. Those with web access are either top military, high government officials, or those members of Unit 121—and even then web access is greatly limited and activities closely monitored. North Korea’s small internet footprint means that proof of its authorship or sponsorship of the attack will be difficult, and retaliation may be close to impossible.

North Korea has a tradition of being easily riled, and many experts say that if the data breach at Sony turns out to have its roots in Pyongyang, it may be an indication of more trouble for American companies in the future. North Korea has been the chief suspect in several cyber-attacks against banks and financial companies in South Korea in recent years, as well as in a major cyber-attack on South Korean television and radio broadcasters in 2013.

Security experts and law enforcement say that cyber-attacks and data heists are what crime will look like for the foreseeable future. Gone are the days of guys with guns hijacking trucks carrying reels of film, boxes of videotapes, or stacks of CDs. In place of this kind of strong-arm crime is a new kind of criminal who uses the computer to steal digital data. Since much of the motion picture industry is moving toward fully digital production and editing processes, the heist at Sony Pictures may be the first of many to come.

Tesla: A Case of Supply Versus Demand


By Thursday Review staff

Billionaires can afford to lose some money, and sometimes it makes perfect sense—especially if what they seek is a game-changing result for all that toil and investment.

Elon Musk’s automotive company, Tesla Motors, just closed another unprofitable quarter, reporting to investors a loss of about $74.6 million for the third quarter of 2014—roughly twice Tesla’s loss for the same quarter in 2013. But that is the extent of the bad news for the unconventional carmaker: Tesla has seen its overall sales steadily increase.

More importantly, Tesla’s sales figures topped the industry analysts’ estimates, confirming Musk’s vision as being less risky than some have suggested. Tesla has been selling its cars at a healthy clip, and figures indicate that it may reach $1 billion in overall sales by the end of this year. Currently, Tesla is reporting nearly $932 million in sales for the first nine months of 2014—up from about $600 million for the same period last year.

Musk and his team at Tesla explain the discrepancy this way: production of cars cannot keep up with demand, and, as a direct result, lots of extra cash is being spent trying to find a way around the production problems. Tesla, which plans to invest billions to open a massive battery plant in partnership with Panasonic, has been hampered by the inability to produce its innovative lithium-ion batteries fast enough. Though demand for the pricey cars continues to rise, the battery bottleneck remains.

Still, industry experts see fortune on the road ahead for Tesla. Battery partner Panasonic—convinced of the potential profits for Tesla’s green cars—has committed to a long-term investment in producing lighter, more efficient batteries. So sure is Panasonic of future earnings, the Japanese conglomerate recently announced plans to begin ramping down its production of many other consumer products, redirecting its energies toward smart batteries and green car components.

Tesla also plans to revamp its current factories to improve production of its cars, including its biggest seller, the Model S sedan. In addition, Tesla hopes to be selling its newer Model X no later than one year from now.

Musk does not see demand as the central issue, and many automotive analysts agree. Musk’s problem, ironically, is producing the vehicles fast enough. The factory retrofit may help, as Tesla intends—if at all possible—to reach an output of 100,000 new cars during 2015.

“Despite losing almost a month of production due to factory retooling,” Musk said in a press statement posted on the Tesla website, “we delivered the highest number of Model S vehicles ever, with several new records set in North America and worldwide.” Tesla also hopes to continue the rapid construction and maintenance of its recharging stations, currently located in about 126 locations across North America (about 123 of them in the contiguous 48 states). Tesla hopes to roughly double that number by the end of 2015.

Tesla car-owners can use Tesla’s Supercharger stations at no cost, and cars can typically be recharged in 20 to 30 minutes, depending on the charge remaining in the battery when charging begins. The Tesla website says that “optimal” charging takes about 40 minutes. Tesla currently locates its stations in strategic spots along major highways and interstates, and its marketing material invites car owners to take the time to get a meal or shop while the vehicle is being recharged. Tesla cars can also be charged overnight in a driveway or garage.

Delays rolling out the new Model X, which is a small SUV, have also hampered some of the sales estimates. On this point Musk accepts responsibility personally, telling reporters that he sometimes lets his perfectionist side interfere with the calendar and timetables. But most auto analysts and green energy analysts see Tesla’s long range plan as solid, and cite Panasonic’s massive investment as evidence that others in collateral industries must agree wholeheartedly with the positive outlook for the California carmaker.


Apple: Bent, But Not Broken

Image courtesy of Apple

By Thursday Review staff

When something goes wrong at Apple, it can be a big deal. The company has long been associated with cutting-edge technology and the coolest of the cool apps and gadgets, and its empire is built largely upon the reliability and cachet of its sleek, dazzling products.

Years ago, when some Apple phones had problems with reception and clarity because of an antenna glitch, Steve Jobs famously (and clumsily) suggested that people simply try holding the phone a different way. That official reaction to a product problem caused a minor marketplace and industry fracas, and internally Apple resolved never to go down that path again.

So when the release of its new iPhone 6 was accompanied—24 hours later—by a high-impact problem, the blowback was immediate. Apple apologized to its customers, acknowledged that it had an issue, and set to work immediately to resolve the glitch as quickly as possible.

The new phone, which went on sale days ago amid lots of publicity and long lines of Apple aficionados (many of whom camped out for days to buy the new devices), contained many très cool features, including a thumbprint tool and a payment platform. But one small thing the device was also supposed to do was make phone calls, and a glitch in its iOS 8 mobile software prevented some iPhone users from doing exactly that.

In addition, the newest phone, which is slightly larger than previous versions, has faced a withering storm of concern over its inability to sustain more than a scant few degrees of bending. As of early this week, Apple had already received an undisclosed number of complaints and returns by customers whose new phones broke or were damaged by bending, in most cases as a result of customers placing phones—as is common practice—in hip-pockets or back pockets.

On Facebook, Twitter, and other social media, the brouhaha has been termed “Bendgate,” and the controversy has led to a lot of back and forth over low-rider jeans, tight jeans, and the kind of jeans that would otherwise appear to be painted on save for the outline of the phone. In addition, the lightweight aluminum and composite shell—apparently unable to sustain the routine pressure placed on it when its user sits on it while it is stored in a pocket—may be a case of less is less, rather than less is more. Some newer phones have also been shown to bend easily when stored in a front pocket, particularly when the user leaves it there while climbing into an automobile or sitting at a desk.

Apple went to work immediately on the problem of the software, and provided users with ways to resolve the glitch. One easy solution was to reverse the software update, and Apple quickly provided guidance on how to do that. Many other customers simply carried their iPhone 6 back to the nearest Apple store to get a quick fix implemented on site.

The issue of the bending and breaking, however, may be more problematic for both Apple and its millions of loyal customers. At issue, at least in part, is the age-old struggle between form and function. Customers often express an interest in smaller, lighter phones and handheld devices. But smaller can sometimes be too small, especially depending on the type of applications being used. So some phones are larger—offering decidedly more surface area and screen space.

This tug of war often pushes phone design toward the very edge of what one might call a tablet (larger phones), while other phones meet the demand of users who simply want a device small enough to be placed in a purse, pocket, or backpack. Optics also plays a part, as some applications on a phone require more surface area than others. Apple and Samsung have been engaged in a running battle over optimal surface area for several years, but neither side has prevailed in a market where new applications are introduced almost weekly. The iPhone 6 is slightly larger in surface area than its Apple predecessors; the iPhone 5s, for example, stands about 4.9 inches tall, whereas the new iPhone 6 is nearly 5.5 inches tall. More surface area means more chances for bending when the phone is stored in tight spaces, like front pockets.

Most phone manufacturers—Samsung, Nokia, LG, Apple, Sony—officially discourage users from placing phones in pants pockets (especially back pockets!), since the resulting damage can sometimes be fast and extreme. Designers with several of the major device makers have experimented off and on for several years with materials that allow for maximum flexibility in pockets and other tight spaces, though no phone—as yet—offers more than a tiny degree of flex before damage can occur.

Apple says that only a handful of customers have complained directly about the bending problem (on Thursday the company said it had received only nine genuine complaints), but only time will tell if Bendgate will go away.

Still, despite a few glitches and a lot of chatter on social media (a YouTube video on the Bendgate problem has been viewed more than six million times this week), Apple’s rollout of the iPhone 6 has been a sales success. As of Monday it had shipped approximately ten million units to retailers and buyers.


Ebola May Place Entire Nation in Quarantine

Photo courtesy of CDC

By R. Alan Clanton, Thursday Review editor

(Originally published September 7, 2014) The recent outbreak of Ebola virus has been the worst since the hemorrhagic fever first appeared in 1976. More than 3,500 people have been infected, and at least 1,552 deaths have been attributed to this year’s outbreak (the World Health Organization puts the death toll at just under 2,000, while other health organizations place the figure at 1,552).

Most of those infected by the 2014 epidemic are in the African countries of Sierra Leone, Liberia, Guinea, and Nigeria.

In tiny Sierra Leone, tucked along the Atlantic coast between Guinea and Liberia, the situation has become so dire that government authorities there plan to enforce a mandatory 72-hour shut-down of the whole country. Residents will be required to stay in their villages and towns, and remain in their homes. Most work will come to a stop, except for the necessary public services. Shops and markets will be closed, and the lockdown will be enforced by the military and local police. Medical teams and researchers—dressed in hazardous materials suits and airtight hoods—will comb through streets, alleys and neighborhoods, going door-to-door to administer tests and locate those who may be avoiding treatment, or those who may be hiding from authorities out of mistrust.

Sierra Leone’s planned lockdown mirrors one enforced by Liberia last month—a mandatory, at-gunpoint quarantine of several neighborhoods so that soldiers and medical teams could enter the area in search of people spreading the disease.

The goal in Sierra Leone, where the disease may be spreading the fastest, is to quickly identify those who may already have the virus in their system but do not yet have any of the symptoms. The lockdown will also give medical teams the necessary time to go door-to-door to provide treatment to those already sick—some of whom may be too ill to travel out of their homes, and others who are just afraid of the police or the medical workers.

But the lockdown in August stirred protests and even violence in Liberia, where police and military clashed with civilians. Some observers say that Sierra Leone’s proposed shutdown may trigger similar unrest.

Sierra Leone has scheduled its nationwide lockdown to begin on September 19. The lockdown will last for three days, but possibly longer if medical teams are unable to reach all towns and neighborhoods. Authorities want citizens to stay put in one place, since travel—even on foot—would cause confusion among the medical teams, and could easily spread the disease to other areas.

Some international and non-profit aid groups are concerned about the impact of a mandatory shutdown. Doctors Without Borders, an international medical organization, has stated publicly that it fears a gunpoint-style quarantine will make the situation worse, causing many in an already fearful and distrustful population to hide or evade contact with doctors. Such measures, the group said in a statement, result in “driving people underground and jeopardizing the trust between people and health providers.”

But UNICEF, an organization funded by the United Nations, has endorsed the lockdown as the only reliable way to fully vet the population and treat the sick.

The World Health Organization (WHO) has endorsed the use of blood transfusions as a means of combating the spread of Ebola. But transfusions are expensive and require large amounts of blood, and as a result other medical research teams are hoping to identify a more reliable form of treatment. The symptoms of Ebola virus come on quickly and drastically, and can include extremely high fever coupled with diarrhea and vomiting. The 2014 outbreak has had a mortality rate of roughly 50%, making it one of the deadliest epidemics in decades.

The Ebola virus is named for the Ebola River, which flows through parts of the Democratic Republic of the Congo (then called Zaire), where several small villages contained the first victims of the newly-discovered virus in 1976.

Related Thursday Review articles: Ebola, Magnified X25K; By Thursday Review staff; Thursday Review; September 5, 2014.


Writing on the Wall: Social Media the First 2000 Years (book review)

Book review by R. Alan Clanton, Thursday Review editor

Facebook recently celebrated its tenth birthday. The multibillion dollar company, founded in 2004 by Mark Zuckerberg, has grown to be one of the most valuable corporations in the world, and its sole product is information and data.

There is no drilling for oil, no laying of pipelines, no ships upon the sea, and no mining of precious metals. There are no bottled or canned drinks, no assembly lines making cars and trucks, no factory churning out toaster ovens, shower curtains, or computer components. Just data—your data, the data of your several hundred closest “friends,” along with the data of roughly 1.3 billion other people around the globe who use Facebook. And unlike other multi-billion dollar companies, from Coca-Cola to Microsoft, from Taco Bell to Koch Industries, Facebook spends almost no money advertising its service.

Further, Facebook has no rivals, at least not in the traditional sense. Coca-Cola competes with PepsiCo, Wal-Mart competes with Target, NBC News competes with ABC News. Facebook’s last real competitor, MySpace, faded into relative obscurity more than five years ago. There are others out there, like Tumblr, Google+ and LinkedIn, but Facebook’s predominance over its quasi-competitors makes any comparison lopsided in the extreme. For the vast majority of computer and smartphone users, the ubiquitous Facebook is a tool as important as one’s wristwatch or one’s credit card. For some, it may be more important.

But is Facebook a game-changer in the long history of human interaction and communication?

A new book by Tom Standage, Writing on the Wall: Social Media, the First 2000 Years, argues that Facebook is merely one in a long series of human inventions designed to make the spread of news and the dissemination of information easier and more reliable. Facebook may be more user-friendly and more democratic in its power to engage, but it is a logical—indeed inevitable—merger of technology with the human need to inform and be informed.

Highly readable and instantly engaging, Standage’s book starts with an explanation of the ancient and entirely human impulse to share news and information and to tell stories about the human condition. At the core of social media is the more primitive concept of the social pack or societal unit, which served a useful and, as it turns out, essential service for its members: food, shelter, protection, family equilibrium, grooming. Facebook, on which the average user has roughly 130 friends, replicates with eerie precision the social networks of humans thousands of years ago, when the average hunter-gatherer clan would top out at about 145 to 150 people. This is known as the Dunbar number, and it marks the largest size of any community in which everyone can know everyone else with some intimacy; above this number, some people become strangers to one another. Over time, physical grooming was replaced with social grooming, in the form of news, gossip, storytelling, and social interactions designed to vet and filter information.
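The Dunbar limit has a simple combinatorial flavor: the number of pairwise relationships in a group of n people is n(n-1)/2, so the web of relationships grows quadratically even as the group grows linearly. A minimal illustration (the function name is ours, not Standage's):

```python
# Pairwise relationships in a group of n people: each of n members
# can pair with (n - 1) others; divide by 2 to avoid double-counting.
def pairwise_links(n: int) -> int:
    return n * (n - 1) // 2

print(pairwise_links(130))   # 8385  -- the average Facebook friend count
print(pairwise_links(150))   # 11175 -- the Dunbar number
print(pairwise_links(1000))  # 499500 -- far beyond what anyone can track
```

At 150 people there are already more than eleven thousand possible one-to-one relationships to keep straight, which is one intuitive reading of why intimacy breaks down past that point.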

For this reason, Standage argues that the human brain is hardwired for social networking, with tens of thousands of years of fine-tuning all forms of direct and indirect communication. From cave drawings to stone tablets, from early hieroglyphics to the first systematic written languages, humans have sought the most useful ways to pass along critical information, as well as tools to filter that information for reliability.

Filtering and vetting information becomes of great importance as human history progresses and languages become more complex. The reliability of news and data also becomes critically important along the way, as humans must learn to sort disinformation from truth, officialdom’s propaganda from balanced reporting and objective evidence. Think of Russia’s seemingly absurd campaigns of disinformation regarding the crash of Malaysia Airlines flight MH17 over eastern Ukraine; or, likewise, its recent incursions into Ukraine despite months of telling the world that vast military movements near the border were simply army exercises.

Standage traces the lineage of mass communication and interpersonal dispatches from the time of the Greeks and the Romans through the invention of the printing press. The ancient Greeks invented and perfected outdoor graffiti as a form of interpersonal communication—writing on walls and creating newsfeeds—two and a half millennia before Facebook. Cicero used papyrus documents to present news and reviews, then asked those who came in contact with the information to add their own commentaries and interpretations. Among Julius Caesar’s various contributions to social media: the development and founding of a prototype newspaper—hand-written, but copied by involved citizens and urged upon those traveling within the Roman lands. Today’s iPads, Kindles, Nooks, and other devices—dazzling though their abilities are—nevertheless bear a striking resemblance to early clay and wax tablets, which were carried by hand or in bags.

Social media—as we understand it—is not new. It represents merely a thread of human interaction embedded deeply in our desire to understand our world, our community, and to connect to those closest to us. What has so radically altered the template has been technology, a tool which has allowed billions of people worldwide to connect using universal tools on computers and smartphones. What was once information spread and disseminated by hand, face-to-face, or in small groups—much the way the word of early Christianity was spread to hundreds, then thousands, then millions, starting with only a few dozen people—can now be sent to thousands within seconds. When Martin Luther sought to repudiate what he saw as a sclerotic, even corrupt officialdom in the church, he used nothing more elaborate than a list posted on a door—which in turn was copied, then copied again, by hand, in what amounted to a declaration gone viral.

Politics has often played a part in social media. The pamphlet and the handbill were early vehicles for proselytizing political views and societal struggles. Printed handouts were sometimes decisive in the cultural and political changes which swept France, Russia, Great Britain, and the United States—thus literacy moved hand-in-hand with political awareness and social advancement. Centuries before Dakota or Starbucks—with their Wi-Fi and smartphone charging stations—coffee houses were the place to hold forth, compare ideas and ideologies, challenge conventions, and foment revolutionary ideas. Like the internet, Facebook, and Twitter, coffee houses were accused of breaking down social skills and encouraging an institutionalized form of wasted time.

In short, are our social media platforms—Facebook, Google Plus, LinkedIn, Twitter, Pinterest—so radically different from the way humans have engaged for thousands of years? Or are they simply the logical merger of digital technology with the human need to connect and share?

Taken as a whole, Standage’s book is highly readable and moves very smoothly. Its only fault—minor, to be sure—is that some chapters belabor a point well after it has been made effectively. Still, it’s easy to overlook this indulgence since he tells the story of social media so well and with such striking comparisons. A fast, fluid, addictive read; and more relevant than a dozen other books on the great technological and business disruptors of our day.

Related Thursday Review articles: Beware the Siren Servers; book review of Who Owns the Future, Jaron Lanier; review by Alan Clanton; Thursday Review.


Will Slower Mean Faster for Net Neutrality?


By R. Alan Clanton, Thursday Review editor

This week—beginning on the morning of September 10—dozens of major internet and technology companies will protest a potentially slower internet by…well…slowing the internet down.

Spoiler alert: it’s a symbolic act meant to demonstrate, through carefully crafted but annoying animations, what the internet of the future will look and feel like thanks to a recent FCC ruling allowing some major internet providers to control the upload and download speeds of some services—that is to say those services which do not agree to pay premium fees or enter into partnerships with the big ISPs.

Companies participating in the protest will include Kickstarter, WordPress, Foursquare, Mozilla, Reddit, Vimeo and a dozen others. Starting Wednesday, users of these services will encounter animations which will seek to replicate the slow speeds web users can expect in an age in which a few internet companies have traffic control over the hundreds of other smaller companies.

At issue is a recent ruling by the Federal Communications Commission (FCC) which reinterpreted its 2010 rules, which back then required that internet providers ensure free, unfettered lanes for all broadband internet traffic—regardless of content or context. After a series of court challenges, an appeals court later decided that unlike utility companies and telephone companies—for whom services are considered a public trust and an essential economic tool—cable TV and wireless phone companies do not fall into this same narrow category, but are instead a consumer-based, supply-and-demand driven business model. According to the appeals court, cable companies like Comcast, Time Warner, Verizon, Sprint and Charter can charge whatever rates these companies see as appropriate based on traditional market factors.

Shortly afterwards, the FCC agreed and essentially endorsed the view of the appeals court, triggering a wave of deals where companies dependent on bandwidth and high internet speeds negotiated agreements with the big cable providers and cellular companies. The most prominent of these quick arrangements was the deal between Netflix and Comcast, wherein Netflix would pay an undisclosed tariff for access to bandwidth.

Smaller web-based and tech companies say that the internet should be an open, unfettered superhighway—and access to its traffic flow should not be contingent upon a few large companies paying more for the fast lane while everyone else is stuck in traffic. But Comcast, Verizon, Time Warner, AT&T and others say that if they are paying to construct and maintain the highway, then they ought to be able to regulate not only the flow of that traffic, but also how much premium and non-premium users will pay. Think of those “fast-lanes” for visitors to Universal Studios in Orlando: some people pay more for the right to not stand in a one-hour line.

Proponents of web neutrality insist that a tiered structure for internet access will inhibit the growth and success of start-up companies, quash many forms of competition, and discourage technological development. It will also strangle innovation, since many web innovations have emerged via experimentation and trial-and-error on the part of start-ups with little, if any, access to capital investment. In the future, a company’s success might be entirely dependent on how much it is willing to spend to ensure unfettered web access. In this view, the quality of a developer’s product or the strength of its innovation would become secondary to the inventor’s ability to strike a deal with big ISPs.

Further, some fear that a non-neutral web would result in similar price-tiering for web users and most Main Street customers, not to mention small businesses. Some cable companies already charge a two- or three-tiered rate structure for internet access.

Freedom of the press advocates also express a separate concern that a non-neutral web could—and likely would—be used to limit access to information, to censor news, or simply to constrain or inhibit unpopular content or alternative points of view. Some fear even outright blockage of some websites.

The non-profit organization Fight for the Future has an online petition—already signed by hundreds of thousands of people. Other organizations have already gathered as many as one million signatures.

The general concept of net neutrality has its roots in technologies from previous centuries, most notably the telegraph lines constructed in the 1800s—lines which allowed for quick communication over hundreds of miles of wire. Telegraph lines were initially built by different companies in a variety of regions, but ultimately—through mergers and acquisitions—two mega-companies emerged: American Telegraph Company, and Western Union. Western Union won a government subsidy in 1860 to construct the first coast-to-coast lines, and upon the completion of those transcontinental wires in 1865, Western Union had considerable sway. By the end of the next year, Western Union had bought most of its rivals and had become a de facto monopoly. Its business model improved as it took great care and effort to ensure that all wire communications were transmitted equally. However, Western Union president William Orton also revamped the company’s technologies to accommodate inventions by quasi-competitors (like Thomas Edison), and by improving service to his three key customer markets: businessmen and large businesses, which transmitted reports and data by wire; newspapers and magazines, which relied heavily on dispatches sent via wire service; and government officials and government field offices, which reported to regional offices or to Washington. In that sense, Orton’s incarnation of the telegraph service offered a tiered product line, as well as open and unfettered access to individuals.

Nevertheless, Congress sought to create a framework of neutrality for the telegraph through the Electric Telegraph Act of 1860, in which it wrote that “messages…from any individual, company, or corporation, or from any telegraph lines connecting with this line at either end of its terminus, shall be impartially transmitted in the order of their reception, excepting that the dispatches of government shall have priority.”
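The Act’s rule reads like a scheduling policy: messages transmit strictly in order of reception, except that government dispatches jump the queue. As a purely illustrative sketch (the class name and sample messages are invented), that policy maps onto a simple two-tier priority queue:

```python
import heapq
from itertools import count

class TelegraphQueue:
    """Two-tier message queue: government dispatches transmit first,
    everything else goes strictly in order of reception."""
    def __init__(self):
        self._heap = []
        self._seq = count()  # tie-breaker preserving arrival order

    def receive(self, message, government=False):
        tier = 0 if government else 1  # lower tier drains first
        heapq.heappush(self._heap, (tier, next(self._seq), message))

    def transmit(self):
        return heapq.heappop(self._heap)[2]

q = TelegraphQueue()
q.receive("grain prices, Chicago")
q.receive("war dispatch", government=True)
q.receive("press wire, New York")
order = [q.transmit() for _ in range(3)]
# the government dispatch transmits first; the rest keep their arrival order
```

The sequence counter matters: without it, two messages in the same tier would have no defined order, and the Act’s “order of their reception” clause would be lost.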

There are those who argue that net neutrality is a red herring and a distraction from the realities of the marketplace. Robert Pepper of Cisco Systems says that “supporters of net neutrality regulation believe that more rules are necessary.”

“In their view,” says Pepper, “without greater regulation, service providers might parcel out bandwidth or services, creating a bifurcated world in which the wealthy enjoy first-class internet access, while everyone else is left with slow connections and degraded content. That scenario, however, is a false paradigm. Such an all-or-nothing world does not exist today, nor will it exist in the future.”

Further, some large internet service providers have made the point that by creating a flat, featureless internet—without tiered pricing and without specially arranged packages with companies in need of wide swaths of bandwidth—big ISPs will have neither the incentive, nor the capital, to upgrade services or invest in improvements. The big players argue that many companies—especially other large web-based services like Google, Skype and Facebook—freeload by using vast tracts of spectrum and creating massive online demand—without actually having to install even a mile of broadband cable, Ethernet, fiber-optic or coaxial wire.

In the meantime, the voices are raised largely in favor of a more pure definition of net neutrality. The question remains: will this week’s protest—deliberately slowing down the web—actually make things move faster?

Related Thursday Review article:  Net Neutrality: Is Some Web Access More Neutral Than Others?; Thursday Review.


Net Neutrality: Is Some Web Access More Neutral Than Others?


By R. Alan Clanton
Thursday Review editor

(Originally published July 11, 2014) Back in February of this year, when Brian Roberts, CEO of Comcast, announced Comcast’s proposed buyout of Time Warner, there was a storm of media coverage about what the merger of these two cable and internet giants would mean. The majority of the press narrative surrounded customer service: would such a mega merger be good for customers in the long run?

The overwhelming response by those who joined in the discussions regarding the massive merger: no. Most feared longer on-hold waits when calling for customer service or technical support, longer delays in the field during outages or service problems, higher rates and new fees, and—in general—abysmal service from two companies with already poor customer service rankings.

Some economists and business analysts looked at the equation from the standpoint of jobs: layoffs would surely follow in the wake of such a massive merger; more call center and tech support operations would migrate overseas; thousands of blue collar, white collar and office support people would be dumped into unemployment. Cynics said nothing much would change at all.

But some realists, and a few optimists, saw merely an inevitably shifting market with new opportunities: Other technologies would continue to spring to life, especially as younger consumers sought content in unorthodox and unconventional ways. Technical innovations would arrive—as they seem to arrive daily—giving us alternative tools to TV content and entertainment choices. Satellite companies would likely gain customers, and their increased revenue would give them more leverage to expand and update their own technological strengths.

But, at about the time that DirecTV was seeing a surge in new customers—mostly those engaged in a pre-emptive bailout from cable in Time Warner and Comcast areas—AT&T announced its proposal to purchase the satellite giant outright. The marriage of AT&T and DirecTV was, in fact, a logical response to recent Comcast acquisitions. Not wanting to be left standing without a chair when the music stops, other media and telecom giants are looking to shore up flanks and find new ways to remain relevant in an age in which technological changes create paradigm shifts almost overnight. Everyone is affected: Verizon, Apple, Amazon, T-Mobile, Sprint, Charter, Cox.

And remember Aereo? Its tiny dime-sized device—basically an antenna for receiving and storing TV signals—threatened to unravel the business model of both broadcasters and cable television. Aereo’s little device was so disruptive to CBS, Fox, ABC and NBC, that the contentious matter ended up in the Supreme Court, where justices recently agreed with broadcasters and declared Aereo’s antenna a tool for theft.

Aereo lost its case, but you can bet that there will be more digital disruptors and existential challenges very soon, especially as Facebook, Amazon, Apple and Google seek to grab more hours of our collective attentions. It was for this very reason that Comcast justified its need to merge with Time Warner, and those same imperatives apply as AT&T makes its case to Congress for the right to acquire DirecTV. The mergers will continue, and customer relations may suffer.

But lost in the hue and cry over customer service has been what some technologists regard as the more transcendent issue in these massive mergers: net neutrality.

Net neutrality, for those unclear on its passive, almost oblique language, is the basic philosophy that says that all internet traffic should be treated equally. Net neutrality is to internet content what Lady Justice is to the law: a formidable, stoic presence ensuring evenhandedness and fairness, and blindfolded to issues of creed, color, convention and context. Net neutrality means that your cable or internet provider will treat equally all traffic coming into your home: streaming movies, online games, access to banking or retail activity, music downloads, payment activities, photo uploads or downloads, television content, music videos and other short video material.

Advocates of a free and open internet regard its neutrality as crucial—a fundamental tenet essential to the free flow of information and the growth of innovation. And there are other comparisons employed. Just as your landline phone is neutral, able to make or receive calls in an unfettered landscape (unless you choose to block certain callers), so too should your internet access be unhampered. Just as your electricity flows into your home on an equal playing field with that of your neighbor, so too should your web access.

Originally a cherished and critical element in the thinking of the FCC and other Federal agencies, net neutrality has seen slippage over the last decade. In an important change of direction, the FCC in 2002 declared that the internet was more akin to an information service than a public trust. In that sense, according to then-chairman Michael Powell, your internet service provider (ISP) was more like a magazine or newspaper: you can subscribe to it—for a price—but there are no guarantees about its content, which comes at the discretion of owners, publishers, and editors.

That change of philosophy set in motion a slow but inevitable shift in the winds. But the tide has ebbed and flowed well into the late aught years. In 2010, largely as a result of complaints that Comcast was interfering with its customers’ web access and internet preferences, the FCC took a step back toward a policy of neutrality by reaffirming its view that the internet should be treated in the same way that government treated phone lines in the early part of the twentieth century: the web is a public trust, and its architecture serves as a “common carrier.” The FCC’s 2010 position was imperfect, and many technology advocates complained of the loopholes, but it was an important step in the direction favored by those who want to see the freest flow of information and content.

But early this year, a U.S. Appeals Court ruled against the FCC and in favor of Verizon; the case had centered upon Verizon’s arrangements wherein the mobile phone giant was charging additional fees to companies like Amazon, insuring those companies premium access (meaning faster speeds), and relegating other applications and services to a slower lane.

The issue has now become a political challenge facing the administration of Barack Obama—and probably the administration of the next U.S. president. Back in 2008, Obama had campaigned on a promise that he would honor net neutrality, but in practice he has done little to re-establish that central canon. Recent telephony and cable television mega mergers have made it abundantly clear that the time is ripe to establish a core value system for the internet’s rules-of-the-road.

That Appeals Court decision in the Verizon case opened the door to more preferential treatment, and by extension it created an avenue for more revenue for ISPs offering premium access. Comcast and Netflix recently came to an undisclosed arrangement whereby, for a fee, Netflix can stream its content across Comcast’s vast infrastructure without inhibitions on speed or quality. Likewise, AT&T enabled iTunes to have a special “fast lane” across its massive architecture, but only after Apple negotiated for the use of that specially tweaked speed and access. Meanwhile, services similar to iTunes, like Spotify, are left struggling with inferior access simply because they are unwilling or unable to negotiate preferential treatment.
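The “fast lane” arrangements described above can be modeled, very loosely, as weighted sharing of a fixed link. The sketch below is entirely hypothetical: the flow names, the 3-to-1 weighting, and the 100 Mbps link are invented for illustration, not figures from any actual agreement:

```python
def allocate_bandwidth(link_mbps, flows):
    """Split a fixed link by weight: 'fast lane' flows carry a 3x weight,
    everyone else shares at weight 1. A toy model, not a real ISP scheduler."""
    weights = {name: 3.0 if premium else 1.0 for name, premium in flows.items()}
    total = sum(weights.values())
    return {name: link_mbps * w / total for name, w in weights.items()}

# hypothetical mix: two paying partners, two non-paying services
flows = {"big_streamer": True, "music_partner": True,
         "indie_radio": False, "startup_tv": False}
shares = allocate_bandwidth(100, flows)  # weights total 8: 37.5 vs 12.5 Mbps
```

The point of the toy model is the zero-sum arithmetic: every extra megabit a paying partner buys comes out of the share left for services that did not, or could not, pay.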

Advocates of an open and neutral internet agree that what suffers most from a tiered arrangement is technical innovation. Smaller companies, unable to pay the premium fees charged by a large ISP like Comcast or AT&T, would face serious challenges to the development of their products and services, and some of those hurdles would be insurmountable without the same access to the internet as other companies. Worse, some fear that a pay-for-speed web would begin an organic process favoring almost entirely the biggest players—those with deep-pockets and little to fear from negotiating privately with big ISPs. Smaller players would be forced from the table quickly, and even those who survive would almost certainly be forced into shotgun marriages with other, larger firms—a cycle of larger companies buying out the smaller ones.

There is also the specific concept of “the last mile” of delivery. Without some form of regulation to maintain an open, unfettered internet, big ISPs become gatekeepers and key-masters in one stroke. Innovative web companies, tech start-ups, hardware makers, software designers, and content creators may have great products and services, but if a big carrier can establish a rate structure for content based on speed and reliability, the benefits and value of these products are lost.

Consumer advocates worry that tiered-price arrangements for content providers will increasingly translate into equally complex pricing and fees for subscribers and web users. In the near future, when Comcast completes its merger with Time Warner, at least one third of the U.S. will be inside the footprint of the Philadelphia-based cable company. That means that customers may experience the full effect of an internet that is anything but neutral. And some business analysts worry that the effects of a stratified U.S. internet could easily spill over into similar behavior in markets around the world. And it may also stifle technology in the U.S. while driving it into foreign markets.

The January court decision was not a total setback, however. Verizon had gone so far as to argue that the FCC had no authority to regulate broadband or wireless access, a position the court rejected totally. But the court basically ruled that the FCC’s 2010 working-position was akin to an overreach. The FCC, according to U.S. Circuit Judge David Tatel, should have more carefully linked its requirements regarding net neutrality to the concept of common carriers. The problem sprang from the FCC’s own bipolar thinking, and dates back to Powell’s 2002 interpretations of ISPs as information services. That 2002 working-position was a legal time bomb, and when in 2010 the FCC attempted to reassert net neutrality, the fuse started burning.

But experts say that now is the time to rewrite and retool the guidelines to better encompass the rapid technologies now unfolding around us.

Some Thursday Review readers have commented in the past about articles we have posted on this topic, especially in the context of recent mergers. One reader, someone familiar with the cable business from the inside, said that the original thinking on internet access was right all along—it’s just that we didn’t collectively see the metaphor as flexible.

“In the 1990s we called it the Information Superhighway,” he said, “and advocates of a free internet understood that to mean just what that image elicited—lots of cars and trucks moving along at a moderate-to-fast clip. But what was not seen was an age in which there were special lanes for carpooling, lanes for buses, lanes for electric cars, bike lanes, wheelchair lanes, you name it. And there are toll-booths. Some people have speedpay, others pay by the month, some by the year.”

In other words, to paraphrase George Orwell, some web access is more neutral than others.

Verizon, Comcast, Time Warner, AT&T and other ISPs also make the case—and it is not unreasonable—that it is, after all, their capital and their labor which builds the highway. Should they not, under such circumstances, be able to charge more for use of the fastest lanes? And if Amazon and Netflix are willing to pay more for the fastest streaming available to Comcast subscribers, would not that additional revenue stream enable Comcast to reinvest in bigger, better, faster highways for everyone? Maybe.

But not so fast, some say. That January 2014 decision came from a three-judge panel, and even then there was dissent. Judge Laurence Silberman, writing a minority opinion, said that new innovations are always at work, sometimes undermining conventional business models.

“This regulation [of internet speeds and reliability],” Silberman wrote, “essentially provides an economic preference to a politically powerful constituency, a constituency that, as is true of typical rent seekers, wishes protection against market forces.”

In the meantime, a billion or more people are now connected to the internet via broadband, telephone infrastructure, or through handheld devices. That number grows by millions worldwide every day. Even giants like Comcast, AT&T, Verizon, Deutsche Telekom, T-Mobile and others cannot keep up with that growth. Thus companies like Google, Facebook and Amazon enter into serious consideration of space hardware and drone technology to bring internet access to still more millions.

In the end, innovation suffers most. Without net neutrality, the web’s core value gets inverted. The internet has enabled tens of thousands of business start-ups to blossom, often with little or no investment of cash, and this influx of dazzling competitiveness has radically realigned the marketplace and reshaped the economy. The power of a neutral highway means that thousands of little companies have the same opportunities as the dozens of big companies, and this can sometimes have transformative effects (just look at Blockbuster, Borders and most major daily newspapers). That level playing field ought to be honored as central to a dynamic economy and a capitalist framework.

Thursday Review will have more on this important topic in the near future, and we invite our readers to send us their opinions and comments on this issue.


Taxis, Drivers & Digital Disruptors


By R. Alan Clanton, Thursday Review editor

(Originally published June 23, 2014) There are hardly any venues in which the term “digital disruption” applies more than in the worlds of cab drivers and delivery folk. Digital disruption, in case you’ve been asleep during recent years, is a term describing how technology shatters conventional business models and up-ends time-honored processes.

For example, Amazon has disrupted the book store and music shop model; in fact, some have argued, those disruptions have become lethal—killing Waldenbooks, B. Dalton, Borders and infecting Barnes & Noble with an inevitable slow death. The newspaper business, to cite another easy-to-digest case, has been disrupted to the point of no return. Journalism has evolved so dramatically—or has been battered so relentlessly—in recent years, that it has faced a hurricane-like existential threat.

But who would have thought that the simple, straightforward business of calling a cab, hitching a ride or getting directions could turn contentious?

Drivers of cabs and shuttles say that they are under assault, and it’s not from low tips or complaints of speeding. They are under attack from digital tools available on your smart phone and in your car.

A tech start-up company called Uber, which is also the name of the app, offers mobile applications which allow drivers and riders to find the fastest possible way to get from point A to point B. Uber also enables average Joes and Janes to arrange to pick up extra cash by sharing their rides with other people going in the same direction. Uber, which is based in San Francisco, is backed by several venture capitalists and investors who liked the idea of crowd-sourced driving and ride-sharing. Uber acts as an intermediary—linking drivers who are tied into its system with anyone who needs a ride. And there is no cash transaction, making it painless and risk-free: riders pay based on a formula which uses mileage and speed (generally, time) to bill the customer by credit card. Tipping is not required, and is—in fact—largely discouraged (the credit or debit card transaction doesn’t even allow for a tip; and if a rider insists on tipping it must be in cash). Drivers who work with Uber touch no cash and never see your credit card.

Popular from the very start, Uber has moved its low-cost operations into hundreds of cities in the United States, Europe and the British Isles. Uber faces only one small drawback in its model—it is pricey: in some U.S. cities it compares modestly to traditional cab services, but in some cities it costs more. But Uber promotes its service as more flexible, more comfortable and infinitely more user-friendly. Uber streamlines its business model to reduce cost, and incorporates a dazzling battery of high-tech tools and web-based applications to make the pick-up and drop-off package fast and seamless.

For riders and ride-sharing advocates with a penchant for nimble mobile phone technology, Uber is a no-brainer. And it has become a cash cow for Uber’s quiet investors. Uber’s pricing is demand-based: during slack times, the cost drops; during peak days or hours, it rises slightly. Market purists see this as street-level capitalism at its best: a driver has something to offer, and you have a need; in between there is a price, negotiated by algorithms, Google, and supply and demand. What’s not to like?
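Uber has never published its fare formula, but the mileage, time, and demand-based pricing described above can be sketched as a simple function. Every rate, cap, and input below is an invented placeholder for illustration, not Uber’s actual pricing:

```python
def estimate_fare(miles, minutes, demand_ratio,
                  base=2.00, per_mile=1.50, per_minute=0.25):
    """Toy fare model: base + distance + time, scaled by demand.
    demand_ratio is ride requests divided by available drivers."""
    surge = max(1.0, min(demand_ratio, 3.0))  # no discount below 1x, capped at 3x
    return round(surge * (base + per_mile * miles + per_minute * minutes), 2)

slack_fare = estimate_fare(5, 12, demand_ratio=0.8)  # off-peak: surge stays at 1x
peak_fare = estimate_fare(5, 12, demand_ratio=1.9)   # peak: the same trip costs more
```

This is the street-level capitalism the article describes: the same five-mile trip yields a different price depending only on how many riders are bidding for how many drivers at that moment.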

But hold on a minute. Ask a cab driver in London, San Francisco, New York or Paris what he or she thinks of Uber—or any of its aggressive new competitors, like Lyft and SideCar—and you are likely to get an earful. Blood pressures will rise, and the expletives will fly. To those in the cabbie world, these web-based start-ups with all their mobile technologies and crowd-sourcing magic are little more than another assault on the working man. Uber destroys jobs, renders a noble profession useless and may be illegal to boot!

In several countries, Uber is under direct attack from the traditional cabbies and their supporters among traffic and highway regulatory agencies. Uber, in their view, lets hundreds of non-licensed, non-accredited individuals act in the place of highly trained, business-licensed cab companies. There are legal fights in Australia, France, Canada and Britain, and protests have taken place in several European cities. In London, Mayor Boris Johnson—under pressure to declare Uber and its kindred competitors illegal—said recently he doubted any such blanket declaration or outright ban would stand up in a courtroom. In Toronto, city officials and police charged Uber with a long list of crimes—from operating unlicensed taxis, to operating a business without appropriate safety standards in place.

In San Francisco, Uber was charged with similar crimes, including infringement against California laws meant to regulate and tax limo services. In several U.S. states, Uber and its competitors have been hit with class action lawsuits; the charges—unsanctioned, amateur drivers stealing tips, thereby undermining an industry pay-scale based on gratuity. Just weeks ago, Virginia issued a blanket cease-and-desist order against Uber for its violation of state laws regarding the appropriate background check of drivers. In Massachusetts, the state declared Uber to be operating illegally because Uber’s use of Google and GPS system technology did not meet the state criteria for measurement or highway guidance (even though State Police now routinely use the same technologies in almost all of their patrol cars).

But the bottom line for cabbies in hundreds of cities is…well…the bottom line. Cabbies say that Uber is an economic threat, not merely a tool for disruption. If Uber is allowed to operate with few restrictions—if any—cab and limo drivers say, then dozens of other start-ups will eventually so flood into the marketplace of street transport as to render an entire profession useless. In tourism-based cities—London, Paris, Berlin, Sydney—where fragile economies still stagger under heavy post-recession conditions, cabbies may lose their livelihoods entirely.

Uber, which is the more formally arranged of the ride-sharing start-ups, faces its own intense pressure on its flanks. Lyft, for example, offers what is usually described as democratic-driving, or peer-to-peer ridesharing. Already operating in nearly 100 U.S. cities, Lyft is designed to link up drivers with riders, with little more than that as its function. Riders can simply donate using cash or credit card, or kick in a few bucks at a routine gasoline stop. In some cities, Lyft has become a ridesharing program of great popularity and ease-of-use. And its small cost is most easily offset by more riders. A vehicle with a driver and three passengers, for example, would be the most efficient way for the driver to defray costs of a trip from Point A to Point B. Within the Lyft community, riders are encouraged to “donate” appropriately based on time and distance. Lyft began as a small-distance spin-off of Zimride, founded in 2007, which facilitated ride-shares over longer distances and encouraged its users to network and build trust through social media. Using references online—and factoring-in things like reliability, friendliness, cleanliness, and timeliness—other users could establish ratings for other drivers and riders.

And like Uber, Lyft incorporates dozens of rapidly-evolving tech tools into its arsenal. Green advocates see Lyft as the ultimate ride-share program. Cabbies see it as a way to undermine their very existence: lots of amateurs simply hauling extra passengers for the cost of gas or the price of a latte. Cities and states continue to debate the complex mosaic of questions and issues: safety, security, insurance and liability, taxes and fees, even the appropriateness of tools being used. And a combination of city, state, and law enforcement see Uber and Lyft as mechanisms for dodging regulation and accountability.

Technologists ask the question: how do these apps and start-ups differ from street level negotiation of a simple trade? For example, when I was a student at Florida State University (this was many years ago, before the popularity of the internet or the advent of mobile apps), students would routinely post little notecards on “travel boards” in places like the student center or the campus post office. On any given Friday, one could simply go to that board—either to offer to share your car with another student and their luggage, or as a way to hook up with a driver. With one phone call, a driver traveling from Tallahassee to Tampa could link up with a student who wanted a ride in that same direction, and voila, the cost of gas could be instantly cut in half. An editor and author like Tom Standage, author of Writing on the Wall: Social Media the First 2000 Years, would agree that those ride-share bulletin boards were simply a rudimentary form of social media, which, in this case, expedited a basic transaction.

Uber originally promoted itself as a luxury service (thus the high-end price point), but the intense competition from crowd-sourcing services has forced Uber into more flexible products. Largely gone are the Lincoln Town Cars and the Cadillac Escalades, and in their place have come mainstream and intermediate-grade cars of every make and style.

As informal and democratic as Lyft seems, there is some structure. The service requires two to three hours of training, and mandates that its participants meet certain minimum standards: a background check, a criminal records search, vehicle inspections, and a prohibition on drugs and alcohol. Lyft also advertises that it insures its drivers for up to $1 million for any serious incident, and its participants must maintain a minimum rating among all users. Drivers whose ratings fall below the threshold are terminated.
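Lyft's materials don't spell out the mechanics of that rating rule, so the sketch below is hypothetical: the cutoff value and the function name are assumptions for illustration, not Lyft's published figures.

```python
# Hypothetical sketch of a minimum-rating rule; the 4.6 cutoff is an
# assumed value for illustration, not a figure published by Lyft.
MIN_RATING = 4.6

def driver_stays_active(ratings: list[float], threshold: float = MIN_RATING) -> bool:
    """A driver remains active while their average rating meets the threshold."""
    if not ratings:
        return True  # no ratings yet; treat the new driver as in good standing
    return sum(ratings) / len(ratings) >= threshold

print(driver_stays_active([5, 5, 4]))     # True  (average is about 4.67)
print(driver_stays_active([5, 4, 4, 3]))  # False (average is 4.0)
```

The design choice worth noting is the averaging itself: a single bad ride doesn't end a driver's participation, but a sustained slide below the community's expectations does.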

Meanwhile, cabbies and their allies have become enraged in the face of dozens of start-ups—tech-based, crowd-sourced entities which threaten the very existence of drivers of vans, shuttles, limos and cabs. Protests in London, Paris, Madrid and New York have gridlocked traffic and brought attention to the economic struggle between the Old School and the New School. But some of the protests, even with support from sympathetic local politicians, have directed invaluable attention to Uber, Lyft and SideCar—free publicity that millions of dollars in marketing and advertising money could not have bought.

The cab drivers may, in fact, be curmudgeonly and gruff in their adherence to a time-warp business model, but some market analysts argue that the problem is not so much the fault of slow-to-react cabbies as of antiquated regulations and statutes. Laws written and approved in 1965 or 1973 have little relevance to either today's streets or the dynamic possibilities of technologies now being deployed on a daily basis. Editorial writers across the U.S. and Europe are suggesting that the only way to break through the ugly impasse is to rethink what is now possible—without sacrificing customer safety, or the safety of others on the road.

Uber, which owns not a single car or SUV, says its business model represents merely a way to connect people with data and information. Lyft says much the same thing, and even tosses in the money-saving aspect and the green benefits.

In the end, it is Old School versus New. If the newspaper and the bookstore are to serve as examples of the outcome, we invite you to place your bets now on how this struggle will end.


Mustang: 50 Years of the Original Pony


By R. Alan Clanton, Thursday Review editor

(Originally published June 15, 2014) The match-up between the Ford Mustang and the Chevy Camaro is a classic rivalry. It’s like Coke versus Pepsi. PC versus Mac. Borg versus McEnroe. Bushes versus Clintons. Well, sort of. The automotive press recently reported that sales of the Camaro are slightly ahead of sales of the Mustang, but that margin is so narrow as to be pointless when it comes to bragging rights. Combined sales of the two cars totaled more than 20,000 in May of 2014, but Camaro beat Mustang by only 579 cars. That’s as close as a car rivalry can get.

But for Ford, bragging rights still come easily thanks to the Mustang. After all, it was the car that changed the landscape (at least the part with pavement) 50 years ago. And despite a nip-and-tuck battle with arch-rival Camaro, fans of the Ford Mustang are celebrating the anniversary of their favorite American car.

Though the article inside My Ford magazine describes its front end as “shark-like,” the visage shown on the cover is more canine—wolf-like, or perhaps predatory in the style of a panther. Maybe it is the intense yellow set against the black studio background. Whatever the effect, or the first impression, the photo of that 2015 Mustang evokes little to do with horses—save for that iconic logo on the grille and the rumbling sense of horsepower found in those muscular lines. No matter your interpretation of that bright yellow incarnation, there is animal power in the image.

Though the cover text does not mention the commemoration, the glossy spring edition of My Ford magazine clearly celebrates the 50th anniversary of what became arguably the most famous car of the second half of the 20th century, and the most durable American car ever created (sorry Corvette and Camaro fans).

Inside the quarterly My Ford magazine, readers will find six pages describing the newest Mustang to hit the streets, along with comments from some of the designers and engineers responsible for the latest incarnation of the always-evolving horse, including the childhood recollections of Scottish-born Moray Callum, now vice president of design at Ford. As a wee tyke in Scotland, Callum saw Mustangs race alongside European sports cars near Edinburgh; already obsessed with automobiles, the budding young designer found the car a game-changer.

The Mustang had a similar effect on thousands, even millions of other people worldwide, but most especially in the United States, where the car was introduced as an affordable alternative to what were—in those days—generally more expensive sports cars. Back in 1964, “sports” in automotive vernacular meant “pricey.” Muscle was part of the mix, but power was secondary to the panache and élan associated with the likes of Jaguar, Ferrari, or the famous Aston Martin driven by Sean Connery in the early James Bond movies.

For its first major appearance, Ford chose the 1964 World’s Fair in New York to unveil the little Mustang, and within weeks she became the most famous debutante in automotive history. The Mustang’s unveiling at the ‘64 World’s Fair is, in retrospect, a remarkable circumstance.

Built on 650 acres in Flushing Meadows, and constructed only after a complicated and sometimes contentious political fight (chief organizer was Robert Moses), the World’s Fair would eventually draw more than 51 million visitors during its two long seasons in 1964 and 1965. With the motto “peace through understanding” as its theme, the Fair that year was also an opportunity to showcase that great confluence of technologies and scientific development—the space age and the newly heightened space race between the Soviets and the U.S., and remarkable new achievements in computers, telephony, television and science. This was also the golden age of some of the most durable mid-20th century corporations—Ford, General Motors, Chrysler, IBM, RCA, Westinghouse, GE, all of whom became major sponsors with dazzling exhibits housed in futuristic buildings. Some of the things that visitors got to see for the first time, up close and personal, were computers (large and small), data punch card machines, analog tape storage, teletype and facsimile machines, the earliest phone and computer modems, and amazing new forms of TV and phone technologies.

Among the most popular exhibits that year were Walt Disney’s various pavilions and kiosks, which—in the vernacular of hindsight—were precursors to Walt’s vision of a much larger, east coast theme park campus, including an Epcot-like exhibit, people-mover and monorail demonstrations, and a “Great Moments with Mr. Lincoln” show which employed the same early animatronics used to great success a few years later in Orlando.

According to attendance records, Ford Motor Company had the second most popular attraction at the World’s Fair. The Ford exhibit included “Ford’s Magic Skyway,” which deployed 50 Ford convertibles—propelled electrically by the ride itself, with no engines running and no gas power—moving smoothly through a controlled course, a ride which offered an elaborate effort at time travel (dinosaurs, wooly mammoths, volcanos, prehistoric cavemen and proto-humans) followed by a futuristic look at highway travel in the United States. The exhibit was designed by a combination of Ford engineers and Walt Disney designers.

Many of those “people-mover” type vehicles in the Magic Skyway attraction were open-topped Mustangs. Although it was unclear at the time, many automotive historians now say that the Mustang’s famous debut at the ’64 World’s Fair was a success in large part because of the confluence of technologies, iconic images and space age developments so closely linked to our memories of that era. In fact, the timing and venue of the Mustang’s debut could not have been more appropriate.

The Mustang had a long incubation period at Ford. As early as 1959, the top brass at Ford knew they wanted to develop something that would embody both sportiness and spunk. Seeking to compete head-on with companies like Jaguar, Aston Martin and Ferrari, Ford produced a prototype called “Mustang I,” whose first images bear little resemblance to the car that would be introduced in 1964. Indeed, photos from Ford’s archives show designers Gene Bordinat and Herb Misch standing and seated, respectively, with a 1961 Mustang I mock-up, a rocket-shaped coupe which looks more like a streamlined European racecar: no double-brow grille, no triple vertical taillights, none of the lines associated with the pony look. The quasi-Formula 1 mock-up is also a snug two-seater, and employs a spoiler-roll-bar hybrid wing only inches behind the heads of the passenger and driver. Artist renderings created by John Najjar further translate the little roadster into a space-age racer, and the original proposals included a 90-horsepower V-4 engine. More working mock-ups were built, complete with working drivetrain and engine.

The space-age roadster would also be expensive, and that was the deal-breaker for one top Ford executive. Lee Iacocca, then chief of the Ford division (of all Ford families of cars and trucks), vetoed the plans for the racy, rocket-like Mustang I. Iacocca had an intuition that Ford could compete with the sports car ideal—while also fusing a mix of economy (low cost to build, and a reasonable sticker price for the consumer) with a broadly defined American look. Iacocca wanted to create something affordable that would appeal to the American sense of identity, self-reliance and independence. Iacocca also wanted a car that would appeal to the young driver.

So the designers went back to work with Iacocca looking over their shoulders. To get something new and dramatic developed quickly, he created internal competition—forming three teams, all working in secrecy, to come up with the best design. To save on cost, and to speed up the process, designers were encouraged to use the existing Falcon frame, building the Mustang atop what could be easily retooled along the assembly line. Iacocca appointed chief engineer Donald Frey as project manager, and insisted that Ford come up with something solid as quickly as possible. Eventually more than a hundred designs were submitted, but the mock-up built by the team headed by Joe Oros was the winner, by far, and the clean-lined little car became the apple of Iacocca’s eye. “It was the only one that seemed to be moving,” Iacocca famously remarked. Iacocca was also fond of Oros’ design for its stand-apart quality—the Mustang seemed imbued with sportiness without looking like a cheap knockoff of the chic roadsters of Italy, Britain or Germany.

Black and white photos taken in a Ford parking lot in September 1962 reveal for the first time the breathtakingly familiar face and body of the infant Mustang. A full two years before its debut, there it was—that magical mix of attributes: the brow of the simple grille set forward between the now unmistakable deep-set eyes; the triple vertical taillights on the sexy, taut behind; the lean styling and trim work which fused weekend sportiness with an American love of the road; and that iconic logo of the wild horse in full stride.

Oros’ 1962 version had beaten out its closest rival, the “Allegro” design, which featured a double brow—just as shark-like as the Mustang featured on the cover of the spring 2014 My Ford magazine. But for Iacocca, the Allegro was a distant second. Still, there was internal debate at Ford, and another season of tinkering with designs—some more muscular, some sportier, some heavier and pricier. But test marketing, early publicity in the automotive press, and pre-debut materials provided to dealerships convinced the Ford brass that the original design was on track, and the alternate mock-ups eventually all lost out to the basic Mustang as seen by the public in 1964.

Iacocca insisted that the Mustang remain affordable, especially for first-time car buyers. Engineers fitted it with a V-8 engine, but constructed the whole thing around a variation of the big-selling semi-compact Falcon, thereby combining power with a low sticker price. By October of 1963 the design had been narrowed even more, and after a few weeks of fiddling with the trim and other cosmetic elements, the pony car was ready for its unveiling in both a hardtop and convertible version.

Iacocca and his team had good instincts about what the public wanted in this new car. The Mustang was a major hit at the World’s Fair. Press reaction was not merely favorable (it doesn’t take much to get the automotive press to swoon upon seeing a new car design), but downright ecstatic. Ford had expected automotive writers and newspaper reporters to generate a few hundred articles based on Ford press releases and publicity material, but within only two weeks, more than 2,500 articles had been written about the Mustang.

Combined with Ford’s powerhouse publicity campaign, plenty of excitement on the part of dealerships nationwide, and all that good press, the Mustang sold over 100,000 units in four months, setting an all-time sales record for a newly minted model. Talk on Main Street and among car aficionados amplified Ford’s marketing efforts, and higher-than-expected attendance at the Fair did not hurt publicity. Dealers reported unusually large numbers of visitors to showrooms by people who just wanted an up-close and personal look at the car. With a list price of about $2,350 for the hardtop and only $250 more for the convertible, its affordability made it a smash success—so much so that many dealerships found it difficult to keep Mustangs on the lot.

Part of Ford’s strategy was to find the sweet spot in marketing: a “youth” car that might also trigger desires and yearnings of agelessness on the part of potential car buyers even into their 30s and 40s—all of this mixed with something within reach of Middle America and its rising disposable income. Where the Falcon had been Ford’s classic economy car—small even by U.S. standards—the bird had quickly become a symbol of miserly thrift. As one writer put it, the Falcon became “your grandmother’s or grandfather’s car.”

The newly-minted Mustang was having none of that. Its very design was meant to evoke something fresh and energetic. Mid-1960s advertising—especially in magazines—reveals Ford’s overt outreach to people no longer defined as “young,” at least by the standards of that decade. Ford wanted to appeal to men and women in their 30s and 40s, who—approaching midlife—yearned for the material and symbolic trappings of youth. Whimsical advertisements touted the transformational power of the hot new car—geeks morphing into go-getters, nerds into dapper players, wallflowers into seductresses. An ad featuring the San Francisco skyline in the background touts the game-changing effects of a Mustang: “Bernard was a born loser. He couldn’t win at Solitaire, even when he cheated. Enter Mustang—the car that’s practical, sporty, luxurious—your choice! Did Bernie’s luck change! Yesterday he won San Francisco in a faro game. Now he’s got his eye on New York. Mustangers always win.” Another ad, politically incorrect by today’s lights, featured a wonky, near-sighted librarian transformed into the center of attention for three dashing men in tuxedos.

Sales of the Mustang were so potent that it also transformed Ford as a business, thrusting it back on a level playing field with industry giant GM and catapulting it past the struggling American Motors and Chrysler. In April 1964 Iacocca appeared on the cover of Time magazine alongside a stylized painting of a fire engine red Mustang. The next week the Mustang appeared on the cover of Newsweek. Between the economical Falcon and the sporty Mustang, Ford had staked its dominance over two of the most important segments of the auto public.

Affordability was linked closely to versatility. The car could be easily ordered with a long list of options (even engine size), many more options than most vehicles offered by the big three. This meant that the sporty car could be marketed across a wide range of demographic groups and potential buyers: young families, single people in their late teens and middle 20s, early middle-agers, graduating students, workaholics, leisure-lovers and pleasure-seekers, those in search of luxury and those in search of economy. It could be marketed as economical and practical, and it could be sold as a car with muscle and flair. The Mustang was meant to look as much at home at the beach or the mountain cabin as it was at home in the circular drive of a posh mansion or in the carport of a middle class home. And it was meant to cut across the generational divide.

One ad which ran in magazines in late 1965 and early 1966 asked the rhetorical question: “Should a man in his 50s be allowed out in a Mustang? Let’s consider what might happen. To begin with, he’ll go around with a mysterious little smile on his face, and a new spring in his walk. Mustang acts that way on a man…”

By the end of 1965, the Mustang had sold more than a half million units in the United States and Canada, and by late 1966, it sold another 600,000. By early 1967 sales of the Mustang had topped all auto sales records for a new model. Ford designers and execs saw little reason to tamper with success, and other than a few small changes to the interior—including a new “padded” dashboard—and some tweaking of the exterior trim, Ford left the Mustang alone, and nudged the sticker price upwards by only a tiny percentage.

The Mustang was so successful that it spawned an immediate rival in the Chevrolet Camaro. GM wasted little time deciding that it would compete with the Mustang on muscle, and the Camaro was quickly marketed as a more powerful mid-priced sports car—a sort of Mustang on steroids. Ford’s reaction was to quickly offer the Mustang in big-block V-8s, with 1967 models pushing 390 cubic inches of engine. But GM’s introduction of the Camaro cut into sales numbers that would have almost certainly flowed toward the Mustang. Still other competitors followed, flowing into what would be dubbed the Pony Car category, with Mustang as the first. Besides the Camaro, there soon appeared the Pontiac Firebird, the Plymouth Barracuda and the Dodge Challenger. All would eventually stake out their claim among sports cars, but the Mustang lineage would remain unbroken.

Ford sold 356,271 ponies in 1967—still a remarkably high number, but nothing like the huge sales of its first two years. In 1968 sales dropped again slightly, to 310,000. Nevertheless, by early 1968 Ford could claim bragging rights: the Mustang had shattered two records—most units sold for a debut model, and the strongest introductory sales of any Ford since the Model A. And by that point, the Mustang had become one of the biggest sellers in automotive history.

Two people have been credited with coming up with the name Mustang. By some accounts at Ford, it was John Najjar, the designer of that first rocket-shaped roadster—and a huge fan of the World War II-era airplane of the same name—who dubbed the prototype sports car Mustang. But the other version of the backstory is that the name was first suggested by Robert Eggert, a marketing executive with Ford. Eggert, who was familiar with horses and briefly bred them, latched onto the name Mustang after receiving a coffee-table book on horses from his wife at Christmas. The debate was never settled within Ford, and the argument continues to this day among Mustang enthusiasts.

The Mustang became popular in motion pictures. Perhaps its most famous role was in the 1968 action thriller Bullitt, in which Steve McQueen (playing alongside Robert Vaughn, Jacqueline Bisset and Robert Duvall) stars as Lt. Frank Bullitt, a hard-hitting, gritty San Francisco detective. The character of Bullitt was based loosely on real-life S.F. detective Dave Toschi. The movie features several famous chase scenes, and plenty of tire-screeching as McQueen leaves rubber behind on San Francisco pavement. One long chase scene in Bullitt is generally considered by film historians to be one of the greatest high-speed sequences ever filmed—second only (and this is argued frequently in bars and at parties) to the famous chase in The French Connection. Ford loaned Warner Brothers several identical V-8 Mustang GT fastbacks for use in the movie, with the expectation that the scenes would generate lots of excitement, and the loaners were further modified by race car driver and mechanic Max Balchowsky. Balchowsky added heavy-duty suspension, beefed up the engines for both horsepower and extra roar, and supplemented the brakes to ensure that McQueen and the stunt drivers could stop the speeding Mustang in the sometimes tight quarters of San Francisco’s streets and alleys.

Mustangs feature prominently in other movies as well, including Back to the Future Part II, American Gangster, and Diamonds Are Forever (a rare case of a Bond movie featuring an American sports car). And of course Ford’s early desire to compete with the British and Italian roadsters met with ironic success in one of the Mustang’s first cinema appearances, when a gleaming white 1964½ Mustang convertible appears in Goldfinger, in this case driven by a woman bent on murder.

The long life of the Mustang has been a complicated story of success and failure, upsizing and downsizing. After several years of continuously making the car bigger—a direct result of Ford’s desire to stay ahead of its fiercest competitors, the Camaro, the Firebird, the Challenger—Ford decided to tip the scales back in the other direction. By the mid-1970s the Mustang began to get smaller. The new compact version became, in essence, a second incarnation. By the end of the ’70s its compact size and lean, sometimes austere styling created a divide between the acolytes of the early Mustang—with its classic lines and iconic profile—and those who preferred the so-called “new breed.” The 1979 model year dispensed with much of the DNA of the original ’60s styling. Round headlights were replaced with four rectangular eyes, and the front and back ends were reshaped to reflect the horizontal lines and single-level bumpers popular at the turn of the decade. Mustang enthusiasts called this front-end design “Four Eyes.” Gas mileage had become an issue for the driving public, and the Mustang grew lighter and shed much of its muscle. By the beginning of the 1980s the Mustang had been through three major phases, all defined largely by size and power. The third generation was built on what was called the “Fox platform,” shared with its cousins in the Ford family—the Mercury Zephyr and the Ford Fairmont.

And though this third incarnation of the Mustang had its fans and advocates—including those who even today regard the period as having produced a classic—sales began to slump, in part because of fierce competition from Japanese automakers. Toyota, Honda and Datsun were finding huge success in North American markets. But Ford wasted little time grousing about the influx of fuel-efficient cars from overseas, and instead tightened the efficiency of the Mustang (along with several other Ford products). By 1986 Mustang sales had climbed back up to 224,500, indicating to the top brass at Ford that the car was back on track. (Iacocca had famously moved on, and was by then CEO of Chrysler.)

The more muscular version of the Mustang, with its “5.0” V-8 and 200 horsepower, even became popular with law enforcement. Highway patrols in several states purchased fleets of modified ponies to have on hand as pursuit vehicles. Through the end of 1988, sales of the Mustang remained strong.

As Mustang enthusiasts know, there followed a fourth and a fifth generation of Mustangs, bridging the decades from the early 1990s through the end of the 2000s. In the early-to-mid-aughts the car began a slow reinvention of its earlier form. Fifth-generation Mustangs were especially notable for their striking homage to the ancestral visage: rectangular headlights were phased out and replaced with the round ones so familiar on the earliest Mustangs, while the triple vertical taillights, the deep-set eyes inside the double brow, and other styling elements—all part of the Mustang DNA—worked their way back to prominence. Ford even began adding très cool retro elements to the dashboard. Sales continued to rise, with occasional dips. But Mustang enthusiasts liked what they saw, despite intense new pressure from arch-rival Chevy Camaro toward the end of the aughts.

Economic pressures sent all car sales down in the Great Recession, and sales of the Mustang—like those of almost all models—sagged considerably. Along with other automakers, Ford suffered mightily between the end of 2008 and the beginning of 2011. But unlike General Motors and Chrysler, Ford declined to take money from the federal government when the recession sent Chrysler to the very edge of the abyss and pushed GM into bankruptcy.

Now the durable Mustang begins its sixth incarnation, in part to celebrate the car’s anniversary, in part to kick sales numbers upward. The price of the Mustang is now roughly ten times that of the original standard edition, but adjusted for inflation and the sweeping changes in auto design and engineering, that rise is not only reasonable but modest.
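As a back-of-the-envelope check on that claim (the inflation multiplier below is a rough approximation, not official CPI data, and the prices are round figures for illustration):

```python
# Rough, illustrative arithmetic only; the inflation multiplier is an
# approximate 1964-to-2015 CPI figure, not an official statistic.
ORIGINAL_PRICE_1964 = 2350.0   # approximate list price of the 1964 hardtop
CPI_MULTIPLIER = 7.7           # assumed 1964-to-2015 inflation factor
NOMINAL_PRICE_2015 = 23500.0   # "ten times" the original

adjusted_1964_price = ORIGINAL_PRICE_1964 * CPI_MULTIPLIER
real_increase = NOMINAL_PRICE_2015 / adjusted_1964_price

print(f"1964 price in 2015 dollars: ${adjusted_1964_price:,.0f}")  # about $18,095
print(f"Real-terms increase: {real_increase:.2f}x")                # about 1.30x
```

In other words, most of the tenfold nominal rise is simply inflation; the real-terms increase is on the order of 30 percent, which is what makes the sticker growth look modest against five decades of engineering change.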

Image courtesy of Ford Motor Company

The newest Mustang, the 2015 model now promoted in Ford marketing material but not available at dealerships until fall, has already drawn equal parts accolades and complaints. The added interior space (at 84.5 cubic feet, the 2015 model adds legroom front and back), a slightly wider body, a choice of three engines, and independent rear suspension all add to the car’s value and look. But a few Mustang enthusiasts complain that the new trapezoidal grille, restyled front and rear ends, and other design changes add up to some loss of the trademark Mustang visage—that the newest version breaks the genetic code.

Time will tell if Generation Six carries the Mustang legacy forward, and Ford will get to see if this dramatic gamble pays dividends when the car hits showrooms in early November.

Editor’s Note: Were you ever a Mustang owner? If so, send us your stories about your pony car and we will print them here at Thursday Review. Just email them to us.


Can Angkor Wat Teach Us About Water Management?


By Earl Perkins, Thursday Review associate editor

(Originally published May 4, 2014) Man has experienced problems with water management systems dating to Ancient Rome and beyond, but archaeologists say those issues may have also driven residents from Cambodia’s famed Angkor Wat temple complex 1,200 years ago.

Airborne laser scanning (lidar) has uncovered roadways and canals, producing a detailed map of a vast cityscape which reveals a bustling ancient city linked to the temple complex, according to a peer-reviewed paper published in the journal Proceedings of the National Academy of Sciences in June 2013.

Angkor Wat is Cambodia’s top tourist destination and one of Asia’s most spectacular landmarks, constructed in the 12th century during the Khmer Empire. Cambodians are extremely proud of the temple, placing it on their national flag and having it named a UNESCO World Heritage Site.

Those high tech airborne lasers show numerous highways and previously undiscovered temples in the city known as Mahendraparvata, which archaeologists had suspected lay beneath a canopy of dense vegetation. The site is located on present-day Phnom Kulen mountain in Siem Reap Province.

“No one had ever mapped the city in any kind of detail before, and so it was a real revelation to see the city revealed in such clarity,” said Damian Evans, University of Sydney archaeologist and the study’s lead author. “It’s really remarkable to see these traces of human activity still inscribed into the forest floor many, many centuries after the city ceased to function and was overgrown.”

Researchers loaded equipment onto a helicopter in April 2012, spending days crisscrossing the forest from 2,600 feet. In 20 hours of flight time, they covered 370 square miles of terrain, studying Angkor and the two nearby complexes of Phnom Kulen and Koh Ker. Their findings were later confirmed by Australian and French archaeologists who slogged through the thick vegetation on foot. Researchers had previously spent years doing ground research and excavations to map a 3.5-square-mile section, but the lidar revealed a 14-square-mile downtown with a larger population than previously estimated.

“The real revelation is to find that the downtown area is densely inhabited, formally-planned and bigger than previously thought,” Evans said. “To see the extent of things we missed before has completely changed our understanding of how these cities were structured.”

Archaeologists are unsure exactly why Mahendraparvata’s civilization collapsed, but some theorize that water management issues may have driven out residents, Evans said.

Researchers are anxious to begin excavating the site in the near future, seeking clues concerning those who lived there. They will recover and analyze material and environmental data left behind, including artifacts, architecture, biofacts (eco-facts) and cultural landscapes.

The medieval Khmer Empire traces its origin to Jayavarman II, who proclaimed himself King of the World in 802 CE. The great ruler may have jumped the gun slightly: several centuries passed before the Khmers eventually built Earth’s largest religious monument. Angkor Wat became the crowning glory of a kingdom whose capital region, in present-day northwestern Cambodia, sprawled across approximately 1,000 square kilometers by the 13th century. That vast urban landscape lies hidden in Kulen’s jungle and in the lowlands surrounding the temple.

The laser imaging reveals a cityscape at the heart of the Khmer Empire (9th to 15th centuries CE) that was more sprawling and complex than previously thought, leading archaeologists to consider the possibility that climate change and the kingdom’s sprawling waterworks made the complex unlivable. Angkor was the most extensive city of its type in the pre-industrial world, with its waterways and reservoirs vital to producing enough rice to sustain hundreds of thousands of inhabitants. At its height, the empire covered much of modern-day Cambodia, central Thailand and southern Vietnam, and the lidar information “is astonishing,” according to Roland Fletcher, a University of Sydney archaeologist and member of the international team. “We found the great early capital of the Khmer Empire,” he said.

Their research in recent years shows Angkor’s waterworks began breaking down as the kingdom faded into history, a decline that can probably be traced to decades-long mega-monsoons and droughts in 14th-century Southeast Asia (according to tree-ring data published in 2009), Fletcher said. “Things are going wrong by the 1300s.” Massive sand deposits in canals, and the ruins of spillways the Khmers may have ripped apart, were red flags for researchers, he said.

“The discovery of this early Angkorian city is a very exciting example of lidar’s use in the region,” adds Miriam Stark, an anthropologist at the University of Hawaii, Manoa, who has recently started conducting research at Angkor but wasn’t involved in last year’s investigation.

The lidar research shows medieval settlements at Phnom Kulen and Koh Ker had extensive hydraulic engineering on a scale comparable to Angkor, showing a much wider reliance on water management systems “to ameliorate annual-scale variation in monsoon rains and ensure food security,” the team reports. Some readings uncovered cryptic coil-shaped rectilinear embankments covering several hectares near Angkor.

“It was an unbelievable surprise,” Fletcher said. “Nothing like them had been seen before in Khmer architecture.” They may have had some role in farming, but the team cannot say for sure what their function was. The lidar data also showed “very serious” erosion in parts of the ancient city, accounting for deep sand deposits found in excavations, Fletcher said.

The UNESCO website describes Angkor and its wider footprint as “one of the most important archaeological sites in Southeast Asia.” UNESCO is seeking to establish a comprehensive program to balance the site’s vast historical and cultural significance with the ever-increasing pressures of tourism. Some UNESCO representatives are concerned that the nearby development of large hotels, huge restaurants, shops and other tourism-related construction could disrupt the water supply and the water table, eventually causing severe structural damage to the ancient site. According to the British news site The Independent, Angkor Wat receives more than 3,000 visitors on a typical day, making it one of the world’s busiest tourist attractions.



U.S. Space Travel Without Russia?


By R. Alan Clanton
Thursday Review editor

Published May 15, 2014: With U.S.-Russia cooperation at a low point—and many would argue that tensions between the two nations are at their worst since the Cold War—the future of the International Space Station is now in serious doubt.

Tensions over Ukraine have led the U.S. and some of its partner countries to enact economic sanctions against Russia. In response, Russian President Vladimir Putin has floated economic countermeasures of his own, including the threat of cutting off desperately needed oil and gas to Europe. Few of Russia’s tit-for-tat sanctions would actually have a direct impact on the U.S., however, aside from the ripple effect caused by oil price increases worldwide.

But Putin has one ace up his sleeve which does present an immediate problem for the United States and a few of its technological allies: for the last decade or so, the U.S. has been largely dependent upon Russian rockets to get American hardware and U.S. astronauts into space. The decommissioning of the shuttle program in those heady days of cooperation between Moscow and Washington meant that the U.S. could save a bundle of cash by letting the Russians handle the heavy lifting of travel into Earth orbit. At the time there were few—if any—military or techno thinkers who foresaw the kind of political and military trouble now spilling outward from Ukraine.

Furthermore, several U.S. companies, including Boeing and Lockheed-Martin, have been using rocket engines built in Russia for several major Defense Department projects. Why not American rocket propulsion systems? The Russian-made engines are cheaper and ready for use inside American missiles and rockets—or so the logic went until a few months ago.

Force majeure, as they say in the law. That was then; this is now.

The U.S. Air Force (along with other agencies) very badly needs to get some cutting-edge hardware and gadgetry into space. NASA has neither the funding nor the capacity, and few U.S. allies have space programs which are fully operational and at-the-ready.

So, after being routinely bypassed by the Pentagon and numerous U.S. government agencies, Space Exploration Technologies (Space X), founded by billionaire Elon Musk, is now feeling the rush of vindication. The Air Force has put all of its leverage and resources behind getting Space X fully certified for the purpose of getting military hardware and spy satellite swag into space.

In late April, Musk filed a lawsuit in the U.S. Court of Federal Claims contending, among other things, that a partnership between Boeing and Lockheed-Martin, called United Launch Alliance, was little more than a monopoly with a cushy, exclusive contract with the Pentagon for the long-term EELV (Evolved Expendable Launch Vehicle) project. The Space X website contains a press statement in which that joint venture is described as operating “on a sole-source basis without any competition from other launch providers.”

“Space X is not seeking to be awarded contracts for these launches,” the statement says; “we are simply seeking the right to compete.”

Musk told Congressional members in March that the Pentagon has been essentially supporting a two-partner monopoly by using only the Lockheed-Martin/Boeing cabal at a time when others should be invited to the table to offer their own technologies and make bids for space flights.

Now the Air Force is so eager to work with the California-based Space X that the Pentagon has a team of experts devoting their workdays to getting Space X certified for a variety of military launch applications.

Musk heads not only Space X, but also Tesla Motors, a firm devoted to creating workable, lower-cost battery-electric and high-tech cars. Musk told Congressional leaders that “space launch innovation has stagnated [and] competition has been stifled” as a result of the collusion between top Air Force brass and the partnership between Boeing and Lockheed.

The Air Force now hopes that it can work with Space X and forge a partnership which may bring about new flights using the California-based company as early as 2018. Space X has used its Falcon 9 rocket on three previous occasions to deliver materials to the space station, each time using its Dragon spacecraft atop the rocket. The Dragon capsule is equipped with a large payload area (23.5 feet in height and 12 feet in diameter) specially designed for cargo.
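For a rough sense of the scale those dimensions imply, treating the payload area as a simple cylinder gives a loose upper bound on its volume. The cylinder model is our simplification for illustration, not a Space X figure; the real capsule tapers, so usable volume is considerably smaller.

```python
import math

# Rough upper bound on the Dragon payload envelope described above,
# treating the quoted 23.5 ft height x 12 ft diameter as a cylinder.
# This is a simplification; the actual capsule tapers.
height_ft = 23.5
diameter_ft = 12.0
radius_ft = diameter_ft / 2

volume_cu_ft = math.pi * radius_ft**2 * height_ft
print(f"cylindrical envelope: {volume_cu_ft:.0f} cubic feet")  # about 2,658
```

Even as a loose bound, the figure makes clear why the Falcon 9/Dragon combination was attractive for bulk cargo runs to the space station.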

Meanwhile, some in Congress are asking NASA directly: how do we proceed if we do not have Russia as a partner in future space projects? Sanctions have cut in both directions over the last few months, and Russian Deputy Prime Minister Dmitry Rogozin announced earlier in May that his country would halt any further sales of Russian-made rocket engines or boosters to the U.S. or its partners. Russia has also threatened to withdraw completely from the space station program, putting the future of the civilian and science project at risk.

With its funding cut deeply at the end of the last decade, NASA ended manned spaceflight when the shuttle was retired in 2011. Space X has been working with NASA for several years, providing launch services for satellites and other payloads.

The U.S. government also has an open competitive arrangement with several companies—Space X among them—to develop low-cost, innovative rocket systems for shuttling astronauts and supplies into space. The other companies asked to develop technologies for space travel include Boeing, Sierra Nevada, and Blue Origin, the company founded by Amazon’s Jeff Bezos.

Related Thursday Review articles:

Space Bots; Thursday Review staff; Thursday Review; March 12, 2014.



How a College Library Thrives in a Digital Age

Photo: Alan Clanton


By R. Alan Clanton
Thursday Review editor

Bainbridge, Georgia is like many small towns across the Deep South. The busy east-west main street through town sums up the diversity and complexity of the city, with three or four blocks of stately mansions, rows of azaleas and camellias in full bloom, and ancient live oaks—followed, almost immediately, by a busy stretch of road lined with retail shops, fast food restaurants, real estate and law offices, automotive stores, and shopping centers.

But driving a little further east through town on U.S. 84, just past the point where the bypass meets the main road, one will come across the campus of Bainbridge State College, where students from an equally diverse mix of background and culture come to learn and study.

If you are a traveler simply passing through Bainbridge, it would be easy to miss the campus were it not for the imposing Charles H. Kirbo Regional Center, a multi-use facility equipped for conferences, musical events, speaking engagements, meetings and special events (the facility recently hosted a lecture by former President Jimmy Carter, for whom Kirbo was a close associate and advisor). The auditorium wing of the building sits close enough to the road to make it impossible to miss the fact that there is an academic campus nearby.

Once inside the campus, which is heavily shaded with oaks and other trees, you realize that the college is much larger than it appears from the highway, with handsome buildings constructed in a large square around a vast center green space, and parking areas neatly arranged in a decentralized layout. On the north side of that grassy commons area is the school’s library, and like most campus media centers, large or small, it is filled with students busily at work. On the day of my arrival, it was quiet, with about half the tables and workstations taken.

The library at Bainbridge State College is typical of media centers found on campuses all across the country. Students can be found at work on a variety of tasks: engaged in research for reports and term papers; cobbling together key elements of class projects; sitting at personal computers, culling through online sources for book reports or reviews of current events. Many students work on laptops as they prepare papers or complete exams and quizzes, while others tackle homework assignments.

Still other students, wearing headphones, are immersed in the study of foreign language, and some of those students are using computer applications like Rosetta Stone, which—since it requires the student to engage in verbal responses—means that young people can be found hunkered down in small, glassed-in rooms where their spoken responses will not disturb others in the library.

A brief walk through the BSC library, down the main aisle which divides the traditional rows of bookshelves from the open areas designated for study and reading, reveals a wide, 8,000-square-foot glass-enclosed section, recently added, which extends airily into the campus’ green spaces and into the shade of those stately oaks and tall pines. On the day of my visit, only a few students were using this attractive space, but it was easy to see (at least from my perspective) that the area was ideal for reading, studying or essay composition. (Yes, I am a library nerd: within about one minute of arriving in the recently added space, I had picked out what would have been my “favorite” table for studying or writing).

Friendly disclosure: before I spent the last 30 years working in media, mostly in print journalism, television production or cable TV, I worked in a library—for eight years, in fact. It was a big public library, brimming and bustling with activity, but with the majority of its foot-traffic circling around the all-important main reference desk and the inescapable, handsome, maple card catalogue cabinet which sat in the center of the main floor. Most libraries, large or small, still look this way, except that you will be hard-pressed to find an actual card catalogue. They turn up now in antique stores, where folks pay premium prices to have these vintage cabinets in their homes and offices.

The demise of those durable card catalogues in libraries is just a small element in a world being transformed by computers, the internet and big data.

There is no stopping the conversion of information to the digital realm. The process began in the 1950s with the large, mainframe computers built by companies like IBM. Much of that data was stored on large reels of analog tape. Digitization accelerated somewhat in the 70s and 80s with the arrival of ever-more-inexpensive forms of business and personal computing, along with cheaper ways to computerize data in offices, retail environments, government offices and academic venues. The upward curve grew more frenzied by the 1990s as the price of computing came down even more, and as millions of people worldwide migrated toward the internet. But even then, the vast majority of data and information was largely analog, and libraries were no exception.

By the start of the new millennium, the pace of digitization became feverish.

In their 2013 book Big Data: A Revolution That Will Transform How We Live, Work, and Think, authors Kenneth Cukier and Viktor Mayer-Schönberger stress just how much the transformation has accelerated within the last decade.

“As recently as the year 2000,” the authors write, “only one-quarter of all the world’s stored information was digital. The rest was preserved on paper, film, and other analog media. But because the amount of digital data expands so quickly—doubling around every three years—that situation was swiftly inverted. Today, less than two percent of all stored information is nondigital.”
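The arithmetic behind that inversion is easy to check. A quick sketch, assuming the analog stock stays flat while digital storage doubles every three years (a simplification built only on the figures in the quote above), shows the digital share climbing past 90 percent within fifteen years; for nondigital data to fall under two percent, old analog media must also be discarded or digitized along the way, not merely outgrown.

```python
# Model the quoted claim: in 2000, 25% of the world's stored
# information was digital; digital data then doubles every three
# years while the analog stock stays constant (a simplification).
analog = 75.0   # arbitrary units: the analog 75% of the year-2000 total
digital = 25.0  # the digital 25%

for year in range(2000, 2016, 3):
    share = digital / (digital + analog)
    print(f"{year}: digital share = {share:.1%}")
    digital *= 2  # doubles every three years
# 2000: 25.0% ... 2009: 72.7% ... 2015: 91.4%
```

Under these assumptions the share rises from one-quarter to more than nine-tenths in fifteen years, which is the "swift inversion" the authors describe.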

By the beginning of the aught years, there were wild and hyperbolic predictions of the death of print, and, by extension, the inevitable obsolescence of the library. But to paraphrase Mark Twain, those reports were greatly exaggerated.

Though the transition to the digital age has been a costly, awkward and sometimes lumbering process for many media centers in the United States, in some ways the library at Bainbridge State College got there first. BSC was the first academic institution in Georgia to convert its microfilm and microform records of the local newspaper (The Bainbridge Post-Searchlight) to a fully digital format. Previously, BSC’s back issues of the Post-Searchlight and older local papers were stored on 111 reels of analog film, which meant—as at most libraries in the pre-digital age—that users had to view images of newspaper pages on a large, expensive device designed to read microfiche and various flat or rolled film storage formats. When BSC completed the digital conversion in 2007, its first-in-the-state achievement made headlines, including those of the Post-Searchlight.

Students can now access older editions of The Post-Searchlight (or its Bainbridge predecessors) using a computerized database familiar to anyone even moderately comfortable with computers. Other than a few gaps—normal after a century or more—the BSC’s digital newspaper archives can access past issues of the Post-Searchlight (or its predecessor, The Bainbridge Argus) as far back as 1869. On the day of my visit, Library Director Susan Ralph assisted a student in pulling up back issues of The Post-Searchlight which dated to the early 1980s.

Another element immediately apparent at BSC’s library: security. Like many newer media centers, and some older ones easily retrofitted for the task, BSC has a walk-through security system near its entry and exit areas. Its modest purpose is to prevent theft—intentional or accidental—of library materials. But like many campuses, there is also the more serious issue of safety and security for the students themselves. Another staff member gave me a brief tour of the main reference desk, and one of its prominent features was a full-sized monitor upon which was displayed high-resolution video images of almost all areas inside the facility. At any given moment, librarians and staff can easily monitor what’s happening with a simple glance at one of the live images on the multiscreen display.

Computers are now a fact of life in libraries—both for internal purposes, and for the benefit of library users.

And in an age in which so much data and information is now online, the library at BSC has large areas devoted entirely to computers and web access. Sitting at individual workstations, students can search the internet for materials related to their studies—newspapers and magazine websites, trade or professional journals, electronic news sources, and online databases and research websites. On my visit, I saw students reviewing websites related to nursing, physical therapy, transportation and logistics, and early childhood education.

At numerous other workstations, students have the option to make hard copies of papers and class materials by using printers located nearby. Scores of the computers are equipped with the usual battery of Microsoft Office products for word processing, spreadsheet management and other tasks—an obvious benefit to students who may not have access to a computer at home. Students can compose book reports, term papers, essays and other written projects, and print copies of their completed work with a keystroke. Unless the quantity of printed pages is excessive, the use of library printers is already covered by student fees appropriate to their coursework. BSC also employs a tech support person, available during most hours of operation, to contend with the myriad of potential issues faced by both library staff and students—hardware or software problems, internet disconnections, application failures, glitches with screens, keyboards, servers or routers.

Still, no amount of conversion to digital replaces the printed book. Like most campus libraries, the floor space at BSC’s media center contains a vast footprint devoted to rows of shelves filled with books, thousands of books (about 45,000). And like most academic libraries in the U.S., these books are arranged using the Library of Congress Classification system (most public libraries, and some public school systems, use the Dewey Decimal Classification system).

And that brings us back to the matter of the vanishing card catalog.

Digital databases and ever-advancing search engine capabilities make finding a book relatively easy—arguably the most transformative change in the 21st century library—much in the same way that the search engine has transformed how much of the world’s population thinks and acts when in search of information (for better or worse). Indeed, the card catalog’s inevitable obsolescence was made even more irreversible by the search engine’s singular ability to locate specific types of data within seconds. The more specific the request, the narrower the results—sort of. Go to the search window of Google or Yahoo and type in, say, small business ideas, and you will get over one billion results, “ranked” more-or-less by relevance and popularity—an indication not only of how much linked data is now available, but also of how broadly the search request harvests items related to that query. Type instead small business ideas using noodles, and the results narrow greatly, to under three million. But type small business ideas using noodles from Croatia, and the results widen again to over 16 million, because the engine also harvests pages matching any one of the terms. Adding “Croatia” does not help to narrow the search. Thus the paradox: less is more, unless it is too little.
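The widening effect described above is easy to reproduce. A toy sketch, with a five-document corpus invented purely for illustration, shows why a match-any-term (OR) search grows as words are added, while a match-all-terms (AND) search can only shrink:

```python
# Toy illustration of the search paradox: an OR-style match (any
# query term) widens as terms are added, while an AND-style match
# (all query terms) narrows. The corpus is invented for the example.
docs = [
    "small business ideas",
    "noodles business ideas",
    "croatia travel guide",
    "homemade noodles recipe",
    "croatia noodles business ideas",
]

def or_matches(query, docs):
    terms = set(query.lower().split())
    return [d for d in docs if terms & set(d.split())]   # any term present

def and_matches(query, docs):
    terms = set(query.lower().split())
    return [d for d in docs if terms <= set(d.split())]  # all terms present

q1 = "business noodles"
q2 = "business noodles croatia"
print(len(or_matches(q1, docs)), len(or_matches(q2, docs)))    # OR widens:   4 -> 5
print(len(and_matches(q1, docs)), len(and_matches(q2, docs)))  # AND narrows: 2 -> 1
```

Real engines blend both behaviors with ranking, which is why adding a stray term tends to widen the results rather than empty them.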

This is where libraries have the edge. Searching for information in the library can be a refreshingly targeted process, especially if you know exactly what kind of information you want. Therefore the college library—and the library at BSC is no exception—still requires some nuance and out-of-the-Google-box thinking on the part of students.

BSC participates in a vast system called GALILEO Interconnected Libraries, or GIL (GALILEO itself stands for GeorgiA LIbrary LEarning Online), an education intranet database which connects all 35 college and university libraries in the state. Using GIL, one can access articles in magazines, academic journals, trade and professional journals, books, and thousands of other kinds of research data—from any library within the GIL system.

I watched as a library specialist, Kaye Guterman, worked with a student to show him some of the finer points of searching for information using GIL. Not long after he signed in to the database (using his assigned username and a password), she asked him what subject he wanted to research. He chose coaching, then narrowed it to high school and football. Still, that search request brought up thousands of results spread out across dozens of categories—sports medicine, sports records and game stats, motivation and self-help, biographical and autobiographical, along with educational data. But with a little prompting, she was able to counsel him in the art of fine-tuning his search to find materials most relevant to his topic. Soon, he was able to see an easily-ranked list of journal articles and books relevant to one of his preferred career paths: coaching high school sports.

GIL can give more-or-less instant access to articles and web-based research, but it does not necessarily guarantee that the books listed are within a 60-second walk in the same library. But because the GIL database also includes all books in its huge statewide system, users can request delivery of books and other items not found in their closest library. Using a network of couriers, books can be quickly pulled from one library and delivered to another, often within 48 hours or less.

In other words, students need merely to have a working knowledge of how to use the GIL system, and, then, plan ahead to request a book which may take a day or two to arrive.

GIL does not entirely replace a much older, largely national program called Interlibrary Loan, in which books can be shared between public and academic libraries, typically using traditional forms of delivery such as the postal service. The library at BSC is a member of both the Georgia Online Database (GOLD) and the Southeastern Library Information Network (SOLINET). But GIL, through its dynamic, mass statewide database, speeds up the process and—in the long run—saves millions of dollars by reducing duplication of materials.

GIL also has the secondary advantage of giving librarians a real-time, user-friendly tool for understanding and developing their own collection: books or materials in high demand can be purchased for the home collection; books in less demand can remain accessible through GIL. This takes the guesswork and crystal-ball reading out of library acquisition (an expensive process), and greatly reduces money spent on books which may spend the next few years of their shelf life collecting dust. The real-time data also means that students need not suffer from a lack of appropriate research material because of missed cues on the demand side of the equation. For a state still struggling to recover from the budgetary effects of the Great Recession, interconnected data programs like GIL are a win-win. State legislators love win-wins.

GIL is also constantly expanding to include more digital materials and the resources and books of libraries outside of the strictly academic environment; GIL includes not only the 35 college libraries, but also the combined materials of technical colleges, the Georgia Department of Archives and History, and other historical and research centers. (GIL’s main website includes a listing of all participating libraries, complete with links to library websites, library hours, locations and maps.)

And like much of our search-engine world, GIL employs a quick and easy way for users to check the status of their requests, retrace previous search requests, and painlessly renew a book which may be approaching its due date.

The BSC library also offers a self-service program called LibGuide, a carefully developed set of reading lists and study guides formed through collaboration between librarians, instructors and department heads. This gives students instant and round-the-clock access to a full range of appropriate research material crafted specifically for a student’s coursework.

Another high tech tool indicative of the transformation wrought by computers and social media: during regular library hours, students can use an online chat service to talk directly to a reference librarian.

Editor’s note: This is the first of a multi-part series about libraries in the digital age; future segments will include a look inside public libraries (large and small), historical archives and collections, and other academic libraries.


First Among Cosmonauts


By R. Alan Clanton
Thursday Review editor

Size does matter, and sometimes smaller is better. Such was the case in the earliest days of the space race.

Indeed, the first human in space was chosen, out of his total class of 20 space voyagers-in-training and out of his elite class of six, partly because he was the shortest. The first two Russian Vostok space capsules were so small—and weight was such a key factor—that the 5-foot-2 Yuri Gagarin edged out the taller members of his class of six original Vostok cosmonauts. His closest runner-up, Gherman Titov, was only one inch taller than Gagarin.

In Russia, on April 12 each year, citizens celebrate something called Cosmonautics Day, an annual event recognizing the great achievements of the combined space programs of the Soviet Union and post-communist Russia. Back in 2011, the holiday was officially rechristened as International Day of Human Space Flight, more cumbersome to write or say perhaps, but the commemoration remains the same: recognition of that day in April 1961 when Gagarin became the first human to go into space.

Gagarin’s launch atop that Russian rocket—like the previous Soviet achievement of putting the small Sputnik satellite in Earth orbit—vastly accelerated the great space race between the Russians and the Americans, arguably the most intense technological battle between the Cold War superpowers, and, some historians have argued, a way for those powers to convert military animosity and the looming threat of mutual annihilation into scientific competition.

In those early days of the space race, the United States lagged behind the Soviet program. Sputnik came as a shock to the West, as did Gagarin’s achievement. But the challenge sparked a battle of wills which the Americans would eventually win with the 1969 Apollo 11 moon landing (and subsequent lunar missions), and the space programs of both powers produced overnight heroes and a whole new vernacular of space science (see The Golden Age of Space Exploration: 30 Years After The Right Stuff; Thursday Review).

Yuri Alekseyevich Gagarin was born 80 years ago (March 9, 1934) in Klushino, in the Soviet Union, to a bricklayer father and a mother who worked as a dairy milkmaid, both toiling on a collective farm in a small village. As a teenager, Gagarin worked briefly in a steel mill before narrowing his job preference to tractor engineering and repair while in technical and vocational school. He excelled at his studies, and according to his biographical data, he showed an early hobbyist interest in aviation. As a volunteer in the Soviet equivalent of the U.S. Civil Air Patrol, Gagarin learned to fly, first in small biplanes and later in more advanced aircraft. A quick learner, Gagarin was flying MiGs for the Soviet Air Force in 1955 at the tender age of 21.

His skill made him an ideal candidate for one of the most challenging assignments in those days: reconnaissance and border flights along the Soviet border with Norway, north of Finland, and along the icy, stormy edges of the Barents Sea north of the Arctic Circle, ever-vigilant for the possibility of incoming American nuclear bombers which would surely arrive by way of the Arctic regions. It was lousy, dangerous work producing endless hours of solitude and sensory deprivation—nearly ideal endurance training, as it happened, when the brass in Moscow went looking for candidates to fill the bill in their top-secret space program. Along with 19 others, mostly pilots, Gagarin was selected to be among the first cosmonauts.

Like the early American astronaut program, much stock was placed in not only physical strength and stamina, but also mental capacity and psychological stability. Psychologists and military doctors rated Gagarin as a prime candidate, much in the same way that the “lab coats” and “smock docs” in the U.S. sought to filter out any aviator who might have difficulty with cramped spaces, vertigo, complex batteries of multi-tasking, sensory deprivation or sensory overload, or abject fear. It was understood, almost from the very beginning, that space travel was a risky adventure, subject to the caveat that pilots might die. Certainly most American and Russian aviators—like their pilot counterparts worldwide—accepted the risks of flight. But going into space was riskier still, subject to testing certain laws of physics not yet fully understood, much less mastered, by those bound by gravity.

Gagarin also possessed that same trait which could be vaguely understood, to paraphrase Tom Wolfe’s famous vernacular, as “the right stuff.” These attributes included not merely courage, but also rapid problem-solving, high math skills, attention to surroundings, attention to detail, clarity and brevity of communication, but especially a package of gifts among which were seemingly contrary combinations: modesty plus bravery; intellect plus physical prowess. Gagarin was also likeable, and a favorite among his peers. He often broke the tension with humor and jokes. His looks were boyish and affable, handsome and rugged; in a helmet, he looks to be an eerie composite of John Glenn and Neil Armstrong—easy grin, smiling eyes, dimpled cheeks and cleft chin, trademark gap between the two front teeth.

On April 12, 1961, Gagarin became the first man to enter space, and the first to orbit the Earth. The achievement would propel the American program to singular importance, as it was not acceptable to those in the U.S. or among its closest allies that the Soviets might gain an insurmountable dominance in space. Though the early space program was marketed by both powers as peaceful, there was an underpinning of military conflict accompanying every step and every launch; millions worldwide understood that the outcome of the “space race” might very well include an existential conclusion for either Marxist-Leninism or capitalist democracy.

Indeed, as Wolfe wrote with aplomb in his non-fiction work The Right Stuff, the space race sparked the greatest surge of patriotism since the end of World War II—especially in the United States. American astronauts like Alan Shepard and John Glenn were perceived as single-combat warriors, trained to be launched into the heavens to joust with the likes of Gagarin, or Titov, or others. In the context of the early 1960s, the very fate of the world depended on meeting this challenge.

Gagarin became an overnight sensation in the Soviet Union, and a superstar for the Soviet marketing message worldwide. He travelled to every continent and scores of countries, making public appearances, participating in ribbon-cutting events, joining in radio interviews and making appearances on television. Among the Soviet-bloc nations, sitting next to him at a formal dinner was the highest honor, and standing next to him on a reviewing platform was the paramount photo op. At home in Russia, he was awarded the highest honor: Hero of the Soviet Union, the equivalent of the Congressional Medal of Honor in the U.S.

But just as many in the U.S. space program were challenged less by fear and physical demands than by intense public scrutiny, Gagarin suffered under the smothering layers of press attention and celebrity. A social drinker in the beer-call sense that many pilots drank, Gagarin went, after his Vostok flight, from a drink or two each day to a pattern of heavy alcohol consumption. Friends and associates say this was due in part to the trappings of celebrity—toasts, honors, parties, dinners—but others have said he was simply overwhelmed by the fish-bowl his life had become.

He was also becoming embittered by his handlers in Moscow: a micromanaged schedule, intense scrutiny of his personal life (famously loyal to his wife prior to Vostok, he was rumored to have had affairs in his days as a celebrity), and the Kremlin’s growing concerns for his safety. He was greatly limited in his flights, and he was banned from any duty which might include serious risk. By the time he was promoted to the rank of colonel in 1963, he spent hardly any time in the cockpit or in the air, save for commercial flights for PR work.

After Vladimir Komarov was killed in his Soyuz 1 flight upon a failed re-entry, the military brass and the Kremlin leadership prohibited Gagarin from future space travel and quashed any further discussion of the matter: Gagarin was far too valuable to the Kremlin for propaganda reasons. He was elevated instead to the position of assistant training director at Star City, essentially serving as liaison between the young cosmonauts and their superiors in the chain of command (Deke Slayton filled a similar role in the U.S. astronaut program).

But despite being grounded from space and facing heavy restrictions designed to minimize risk, Gagarin flew occasional missions in a MiG, sometimes routine training activities, sometimes for public relations purposes. In spite of every precaution to ensure that he was kept safe, he was killed in a crash along with pilot Vladimir Seryogin, the result of unexpected bad weather. After his death, Gagarin was given the highest posthumous honor possible in the Soviet Union—his ashes were interred in a prominent location in the Kremlin wall on Red Square in Moscow.

Gagarin was indelibly stamped into the history books as the man who took those first steps into space, and the man who also served as the smiling catalyst for one of the greatest technological and scientific superpower showdowns in history. It is not possible to tell the stories of Americans like Alan Shepard, John Glenn, Gus Grissom, Jim Lovell, Neil Armstrong or Buzz Aldrin without first telling of Gagarin's pivotal role.

Ironically, democracy also played an important role in Gagarin's selection. Of the 20 military men chosen for the initial Russian space program, Gagarin was the fraternal favorite. After working together through many months of rigorous mental and physical training, the 20 cosmonauts were asked to participate in a secret vote—a sort of straw poll to decide who, as a group, they thought most deserved to fly in the first Russian rocket. Gagarin received 17 votes out of 20.

Though for decades the Kremlin would not acknowledge it, that informal election decided who would be first in space.

Related Thursday Review articles:

The Golden Age of Space Exploration: 30 Years After The Right Stuff; R. Alan Clanton; Thursday Review.