Republicans gathered in Miami to debate with only days to go before primaries in Ohio and Florida. In a forum remarkably free of insults and mud-slinging, the four candidates talked policy: immigration, jobs, trade, ISIS, and Mexico. Read the full article by Alan Clanton here.
By Earl Perkins, Thursday Review features editor
The director of an organization with close ties to U.S. Representative Corrine Brown has pleaded guilty to conspiracy to commit wire fraud, according to CBS47-Fox 30, The Florida Times-Union, and other media outlets. Earl Perkins examines how this scandal may return to haunt the U.S. Representative. Read the full article here.
Thursday Review features writer Jennifer Walker-James interviews the sisters and a cousin of J.B. Beasley, who, along with friend Tracie Hawlett, were brutally murdered in small town Alabama in 1999. The case has never been solved, but the family still seeks justice for the killer or killers. Click here to read the full story of J.B. and Tracie.
Thursday Review examines Donald Trump’s counter-intuitive strategy of bullying the press, commentators, and other politicians. Will Trump’s political incorrectness damage him in the polls? And will he soon bolt for a third-party run? Click here to read more about Trump’s Strategy and his war with the press.
Thursday Review writers look into the growing problems faced by Democratic front-runner Hillary Clinton because of her use of a private email account; TR also looks at claims by the White House that it did not know that Clinton was using an email address other than what was provided by the State Department. Read the full article at: http://www.thursdayreview.com/ClintonEmailsWhiteHouse.html
Thursday Review’s Alan Clanton looks at the recent vote by the FCC to define broadband providers as public utilities like power companies; but will the big internet providers agree to such an arrangement, or will the lawsuits topple the FCC’s position again? Media Page article: The FCC Rules on Net Neutrality; Thursday Review, March 4, 2015.
Thursday Review examines Brian Williams’ recent troubles, and his resignation from the board of directors of the Congressional Medal of Honor Foundation. Media Page article here: Williams Resigns From Congressional Medal Board; February 20, 2015.
NBC News anchor Brian Williams has taken a leave of absence from his desk for at least two weeks; his departure comes as a result of questions about his retelling of a wartime incident in Iraq, now discredited; will the popular and well-liked anchor return? Or is his career now finished at NBC News? Media Page article: Brian Williams: A Brief Hiatus From NBC News.
Thursday Review’s Alan Clanton looks at the 20-year-old terrorism case in Argentina, a bombing in Buenos Aires which killed 85 people and injured more than 300; investigators say Iran was the culprit. But the prosecutor was found dead recently, shot in the head only one day before he was to present his case to Congress and issue an arrest warrant for Argentina’s President. Thursday Review Front Page article: Argentina’s Prosecutor Planned to Arrest Top Politicians; February 4, 2015.
Thursday Review’s Alan Clanton looks at the legacy of Japanese journalist Kenji Goto, who just wanted to report on the humanitarian crisis Syria’s civil war has wrought; instead he was used as a pawn by ISIS and murdered. Can journalists safely report in the Middle East? Thursday Review Front Page article: We Are All Kenji Goto; February 2, 2015.
Thursday Review’s Alan Clanton looks at the case of the 43 missing students, allegedly kidnapped by local police in Mexico, handed over to a criminal cartel, and murdered, their bodies burned beyond the ability to identify them, even using DNA. Parents want answers as the horrific crime rocks Mexico; Features Page article; Thursday Review; January 30, 2015.
Thursday Review editor Alan Clanton examines Freedom of the Press, Freedom of Expression, and Freedom of Religion in the context of the recent terror attacks in Paris, and especially the murders of editors and staff at the satirical magazine Charlie Hebdo; more than one million people rallied in Paris to protest the violence and show solidarity with those who died. See the full article at: More Than a Million March in Paris.
See more at: http://www.thursdayreview.com/ParisTerrorAttacks.html
By R. Alan Clanton, Thursday Review editor
Late Tuesday night, when my fiancée glanced at the news on her smartphone and told me that Ben Bradlee had died, she asked me if I knew who he was. Of course I knew, and I responded by saying he had been—among other things—the editor of the Washington Post. Then she asked, “What was his claim to fame?” He was the editor of the Washington Post. Enough said, I assumed. But she repeated the question: what was his claim to fame?
Bradlee was editor of the Post during the presidency of Richard Nixon, she said. I smiled. In fact, some historians might argue, one could just as easily turn the equation around: Nixon was President during Bradlee’s tenure as the chief newsman at the Post. Arguably, as a direct consequence of Bradlee’s stewardship of the newspaper, Nixon would ultimately resign the Presidency.
Bradlee was the top editor at the Washington Post for more than a quarter of a century, and during that time he took the Post from a second-tier, second-rate status to a point when many could argue that the New York Times had—by the early 1970s—only one genuine rival, and it was in the nation’s capital, not in Chicago, Los Angeles or Boston. Bradlee accomplished that Herculean task by combining old school qualities of relentless hard, gritty work and supervisory finesse in the newsroom, with a tough form of journalism which bordered on aggressive. He was also at times fearless, as in his decision—in consultation with the Post’s then-owner Katherine Graham—to publish parts of the Pentagon Papers in spite of a robust attempt by the Nixon administration to suppress publication. Allied briefly with arch-rival New York Times, the Post successfully argued the case for publication in front of the U.S. Supreme Court.
Benjamin Bradlee was old school to a fault. He was born Benjamin Crowninshield Bradlee in Boston, Massachusetts, in 1921, a direct descendant of a handful of the first colonials to arrive in New England in the early 1600s. His family tree is peppered with the sort of lineage which connects verifiably to European princes, duchesses, counts and kings. Born into wealth, he attended the Dexter School and St. Mark’s Academy, then went on to Harvard, where he double-majored in English and Greek. While at Harvard, he joined the Navy ROTC.
He also served in World War II; because of his communication skills and writing talent, he joined the Office of Naval Intelligence in 1942, serving in the Pacific in communications roles throughout the war. But Bradlee was not always hunkered down in code rooms and radio shacks—he saw action in the Battle of Leyte Gulf in the Philippines, one of the biggest and most violent battles of the war, and also in the Solomon Islands.
Using some family money as the seed, Bradlee was briefly a part-owner of a small newspaper called The New Hampshire Sunday News, which he helped to found in 1946. After getting the fledgling paper up and running, he sold his share to his partners. Weeks later he went to work for the first time as a reporter for the Washington Post. His first stint with the Post lasted until 1952 when he went to work as a writer for the Office of U.S. Information and Educational Exchange (USIE), a bureau with the central purpose of preparing written material, brochures, articles and films for use by the CIA, State Department staff, and other government personnel. Rumors have persisted to this day that Bradlee may have participated in a CIA black operation in his stint as a reporter for Newsweek magazine in 1957, when he interviewed several members of a pro-Algerian rebel group opposed to French colonial authority in Africa. Later, Bradlee would become the Washington bureau chief for Newsweek. Ultimately, Bradlee would be instrumental in the sale of Newsweek—then on the auction block—to the Washington Post in the late 1950s. The $15 million sale would result in Bradlee being paid in stock in both publications, a move that proved profitable for both Bradlee and his employers.
Bradlee continued writing and editing, and after proving his mettle as a reporter and editor, he was promoted to managing editor of the Post in 1965, and then to executive editor in 1968.
After taking over as editor, Bradlee successfully fused the in-depth feature writing and long-form journalism associated with major magazines with hard-hitting, unflinching investigative pieces. He also shifted more front-page and A-section emphasis to national political stories and the political process. It was in keeping with this dictum that Bradlee found himself and the Washington Post briefly allied with the New York Times. Each of the papers had come into possession of Xerox copies of thousands of pages of a top secret Pentagon report on the Vietnam War—a study which had been commissioned a few years before by then-Defense Secretary Robert McNamara and compiled by a dozen researchers, analysts, and writers. The study was innocuously titled U.S. Decision-Making in Vietnam, 1945-1968. Among the writers and analysts for that massive, 3000-page report was Daniel Ellsberg, a Rand Corporation employee and former Defense Department employee.
Once a strong supporter of the war in Vietnam, Ellsberg had become deeply disillusioned, questioning the morality of U.S. involvement in Southeast Asia. Ellsberg leaked copies of the so-called Pentagon Papers to reporter Neil Sheehan of the New York Times, and later, also to the Washington Post. Despite the fact that the Nixon administration was not implicated in the report (the study covered only the years up to 1968; Nixon was inaugurated in January 1969), President Nixon was outraged by the brazen misuse of secret documents, and may have also worried—unnecessarily, as it turned out—that Ellsberg may have come into possession of later documents purloined during the period when Nixon and Henry Kissinger had secretly developed plans to escalate the war.
Accordingly, the Nixon administration used the considerable resources and power of the executive to stop both the Times and the Post from further publication of the Pentagon Papers. The matter moved rapidly through the courts, and in the end the Supreme Court ruled in favor of the newspapers. The ruling was (and remains to this day) one of the most important freedom-of-the-press cases in American history.
Later, and perhaps most famously, Bradlee trusted his instincts and supported the work of two previously unknown young reporters—Carl Bernstein and Bob Woodward—when the Post stumbled onto the strange story of a burglary at the Watergate hotel and office complex. On the night of June 17, 1972, five men dressed in high-end suits were found prowling around inside the offices of the Democratic National Committee, which in those days leased office space at the tony Watergate. The burglars were caught with electronic bugging equipment, envelopes of cash, and small notebooks—one of which contained backline phone numbers to the White House. Woodward was sent to the arraignment that weekend. Within a day or two, Woodward was joined by Carl Bernstein.
Their investigation would turn into the biggest political story of the 20th Century, and would ultimately result in the downfall of a president. But were it not for the craggy, profane, irascible Bradlee—whose reporting and editing instincts were at the top of their game—the great investigative story might not have ever got off the ground. Bradlee, like his two young reporters, sensed that the story had more to it than what appeared on the surface. Though he was sometimes tough on the pair, he backed them when things got tough—excoriating them when they made mistakes, demanding that they dig more deeply to get to the truth, shepherding them through the challenges, and, in short, insisting that their reporting reflect the Post’s fiercely competitive standards.
Bradlee also cultivated and expanded the role of an independent press. Patriotic to his core, Bradlee nevertheless felt that without independent journalism and aggressive, non-deferential reporting, government agencies, appointed officials, and elected leaders would stray from their essential roles in a democracy.
Among other things, Bradlee raised the bar for accountability journalism not merely for the Post, but also for most major newspapers in America.
Prior to Bradlee’s stewardship as editor, the Washington Post had won only four Pulitzer Prizes. By the time Bradlee had retired, the paper had won 17 more Pulitzers.
See more at: http://www.thursdayreview.com/BenBradlee10-22-14.html
Compiled by Thursday Review editors
Not since Earl Perkins’ long-form retrospective look at Lynyrd Skynyrd has an article or review generated as much comment, reaction and backlash as Alan Clanton’s recent You’re Gonna’ Need a Bigger Foreign Policy. Scores of readers forwarded the article to friends and associates, and we registered a much higher-than-average rate of clicks as a result. We also received lots of comments via email, Facebook, Twitter and Linked-In.
Among those who wrote us were liberals, conservatives, neocons, peace activists, defenders and detractors of President Barack Obama, and even a few who complained about the contraction gonna’ which we used in the headline (an indication, perhaps, that they didn’t bother to read even the first paragraph of the article).
Some of the comments were pro-Obama. The core of those complaints was that we failed to fully attribute blame for the long-term Middle Eastern environment to Obama’s predecessor, George W. Bush, who—along with the neocons of his administration (Dick Cheney, Donald Rumsfeld, Paul Wolfowitz, Scooter Libby, et al.)—took the United States inadvisably into two simultaneous wars for which there was no clear or coherent exit strategy, and one of which may have been based on faulty information or manufactured evidence (Iraq). Fair criticisms. And fairer still is the evidence that in the haste to purge Baghdad of all Baathists after the fall of Saddam Hussein, American policy-makers in-country set in motion the Sunni-versus-Shia sectarian divide once predicted by Colin Powell, a dissenter among Bush’s inner circle.
Other Thursday Review readers suggested that we went too easy on Obama—in essence giving him a pass on his failure to act more proactively during the early days of the Arab Spring, absolving him of responsibility for the chaos and disorder which inevitably followed in Libya, Egypt and Syria, and mollycoddling the President on his profound reluctance to enforce his “red line” in Syria and his reticence to take decisive action on Iraq’s Nouri al Maliki. Others pointed to the President’s unwillingness to quickly address the problem of refugee children entering the U.S. by the thousands during the spring.
In the meantime, President Obama and a score of other NATO leaders met in Wales this week to discuss the rapidly-evolving events in Syria, Iraq, Afghanistan, the Ukraine, Somalia, and other world hotspots. Though no specifics have yet emerged from the NATO summit, there was, at least, agreement on the talking points, especially sanctions against Russia and a long-term commitment to confront and destroy ISIS.
Either way, our article sparked discussion and debate. Here is a sample of some of those comments, sent to us via Facebook, Google +, Linked-In, Twitter, or in dozens of emails:
John Herndon, Fort Collins, CO: Any objective assessment of the foreign policy of the years 2009-2017 will review how progressive weakness, driven by ideological presumptions and an unwillingness to learn from reality, brought the United States to a position of really unparalleled ineptitude and invited chaos to reign, emboldening the forces that have nothing less than the destruction of Western culture as their goal. When reviewing the foreign policy disasters of Britain and her allies in the middle-late 1930s which came to the disaster of 1939, Churchill wrote that “no war was more preventable” than the one which raged for six years, bringing civilization to the brink by 1945. We can only hope that a serious change in our course occurs soon, lest some contemporary of ours say much the same thing of the current age. [Mr. Herndon wrote a longer piece on this subject, an article which we plan to publish this week].
Mike Lanning, Minneapolis, MN: My father, a Korean War veteran and a liberal, blames this [the current spate of crises] on George W. Bush. But arguments which rehash the same old “it’s the last guy’s fault” line miss the point: the United States should have acted with precision and care at the outset of the Arab Spring. Instead, the White House under Obama’s watch chose to adopt a wavering “wait-and-see” approach, so fearful of war that it could not fathom any direct action other than harsh words, fake outrage and imaginary red lines.
Deborah with Gmail: Putin and ISIS are clearly part of the same larger template. They do not fear us [the U.S.], and they don’t respect our allies, for that matter. And [they] have less regard for human life. After Benghazi Hillary [Secretary of State Clinton] got mad, spewing out “what difference does it make?” Now we see the answer to her question.
Roger with Gmail, Melbourne, Australia: As long as the developed nations of the world (and those countries in various stages of economic expansion) depend on oil, these struggles will remain with us, violently. Russia wants to bargain with oil and gas, using it as leverage to make the EU compliant. ISIS, aside from its apparent goal of murdering anyone it encounters, actually seeks control of oil wells and refineries so it can generate its own economy and currency. The U.S. and the U.K. suddenly realize ISIS is within striking distance of Saudi Arabia and Kuwait—another potential disruptor of oil. Iran decides to become well-behaved…why?…they don’t want ISIS commanding their oil fields. Flare-ups in the South China Sea [could be said to] be about oil and energy. Spend a fraction of the money used to make war and use it instead to develop other energy sources. Then, watch this stuff fade away.
Cynthia, Phoenix, Arizona: It’s easy to blame President Obama for all of this, but that blame game fails to address the short-sighted policies of George W. Bush and Dick Cheney. Don’t blame the clean-up crew for the condition of the house the morning after the big party. Iraq was a time bomb set to explode a decade earlier.
Mel Garrett, Atlanta, GA: Well-written and thoughtful piece, and helps to explain some of the more confusing aspects of what I see in the news each day. These things seem so far away, but clearly this stuff could very easily appear on our doorstep very soon.
Elias V., Port Charlotte, FL: The neo-cons have been somewhat vindicated by the events of the last three years, and this was all a predictable outcome in Libya, Syria and Iraq. Obama’s conciliatory approach works nicely when it involves photo ops with the leaders of trading partners and economic allies, but it’s “a day late and a dollar short” when it comes to facing threats. The pendulum swung too far, from too much war to too little backbone.
Mauricio (with Yahoo), San Antonio, TX: I didn’t support going into Iraq in the first place, but once we were in, we should have understood the consequences for the whole region. We broke this, now we own it. Some of those ISIS maniacs are just part of Saddam’s old guard, party members we banished to the hinterlands. In 2004 they were just secular political hacks, now they got religion (or so they claim) and half of the weapons we left behind.
Angela (with AOL), Auburn, AL: Benghazi was a warning of things to come. What happened to having a proactive strategy in place, and why is that no one is accountable in the White House? Candid or not, the correct response of a U.S. President should never be “we just don’t have a strategy in place.”
Rick with Yahoo: ISIS murdered thousands in their race across Syria and Iraq, but it took the killing of an American journalist (at the hands of a British terrorist) to spur Washington into some kind of action. And when did Joe Biden become the White House hawk?
Ann in Richmond, VA: I’m old enough to remember when Presidents like Johnson, Nixon and Ford wanted to take decisive action, but then ran afoul of lazy bureaucrats and government lard. Now it’s the other way around—inaction and hesitation inside a White House that never meets a problem head-on, unless it is bypassing Congress.
Cory (with Gmail) in Boulder, CO: Never considered all of these troubles being connected somehow, but your story makes it clear [world events] are a part of a pattern, one conflict feeding off the other. The Butterfly Effect. Not sure I agree Obama is at fault for all of it, but clearly the rest of the world is looking for a leader.
David (with WOW!) in Naperville, IL: Found this article on Facebook. Two points. Unfair to blame Obama for the actions of bullies in other places, for there will always be bullies and aggressors. But I agree that this is the moment for the President to roll up sleeves and get tough, as long as we [the United States] have some partners on this. Putin, ISIS, Israel versus Hamas, Somalia…can’t go alone on these things, and we can’t afford all-out war.
Brett (with Hotmail): Dead dinosaurs. Why do we keep fighting over dead dinosaurs? Think it’s coincidental that ISIS went straight for the oil wells and refineries, even a hydro-electric dam? Think it is coincidence that Russia’s first move was not tanks but the threat of cutting off oil to Europe? Think the Saudi kings and princes want ISIS in their backyard?
Joan in Ponte Vedra Beach, FL: ISIS is exactly the sort of butcher army the world should expect when U.S. non-policy leads to chaos and mayhem in some parts of the world, and when the President’s weak responses in Europe and Asia invite a return to the Cold War. Vladimir Putin got what he wanted in the Ukraine, and his next moves will be calculated with Obama’s weakness in mind. As for Syria and Iraq, the White House has waited for three years and tens of thousands of dead to finally act.
See more at: http://www.thursdayreview.com/ReaderCommentsForeignPolicy.html
By R. Alan Clanton, Thursday Review editor
The new moderator of Meet the Press, Chuck Todd, introduced his stewardship of the show last weekend with two key things: an exclusive one-on-one with the President of the United States (thumbs-up), and a debut show complete with a few rough edges and a few glitches (also, thumbs-up). Yes, thumbs-up.
It’s an age-old tension in electronic media: looks, charisma and starpower versus gumshoe journalism and fearless questioning of authority.
Meet the Press, a show so old that it is the longest-running television show in history—on any network, in any country where there are TVs—has itself ebbed and flowed under the tidal weight of this dynamic. The show was invented for radio in 1945, and its co-creator, Martha Rountree, ushered it effectively into TV only two years later. Rountree was a reporter, producer, and writer, and—by most accounts—an innovator in television in a day when it was not clear that TV would have much of a future. Even in the late 1940s she recognized the tension between the reporting process familiar in print, and the new template developing around a technology which could easily punish or reward looks and delivery.
Even in the earliest days of radio, old school print reporters groused bitterly that their rivals with the microphones were little more than actors with scripts and cue cards. Television changed the dynamic more, but reached a kind of stasis by the middle 1960s. By then, TV viewers in the U.S. liked the comforting visages of Walter Cronkite, David Brinkley, Chet Huntley, Frank Reynolds, Harry Reasoner, and others. But looks would prevail, and the rise of the slick, handsome anchor (Peter Jennings, Brian Williams, Katie Couric, David Muir) would become common to our understanding of networks and their fierce battles over ratings (and revenue).
But Meet the Press was never about all of that—or at least it wasn’t supposed to be. Meet the Press was never designed to be “news” in the traditional sense. Its goal—and that of its competitors’ similar programs—was to break free of the newsbite-soundbite formula prevalent in the typical evening newscast (include something about dogs or cats or kids, talk about the “wild weather,” and always end on something upbeat). Meet the Press was meant to be the show that required a cup of coffee by the participants, and it was also crafted to break free of either template.
If viewers really wanted Brian Williams to host Meet the Press, NBC would have moved him to that position years ago.
But when the beloved Tim Russert died suddenly of a heart attack in 2008, NBC needed to make a quick decision. After a few months of letting Tom Brokaw fill in as temporary host, the network settled on David Gregory—a capable reporter and gifted TV journalist. Gregory had everything that the suits at NBC figured would be ideal: urbane looks, a natty sense of grooming and dress, a slick delivery, and what amounted to an anchor-desk-style approach to the show. He was seen as the logical, upward arc of a show which had been home to the likes of Bill Monroe, Roger Mudd, Marvin Kalb, Garrick Utley, and of course Brokaw and Russert.
Arguments remain heated to this day about what role the host (in Meet the Press parlance, “moderator”) should play, and how important looks, delivery and slickness should be to the overall format of the program, which had more of a kinship with newspaper reporting. Rountree was herself a creature of print. Born in Gainesville, Florida in 1911 and raised in Columbia, South Carolina, she would work first as a reporter for The Columbia Record, and later, The Tampa Tribune. She never completed college, and the newspaper jobs were meant to keep her finances afloat until she could one day return to school. But her love of journalism and her mostly self-taught, hard-fought skills became her life’s work. About as old school as it gets. But then, in 1938, she moved to New York City, where—improbably, perhaps, inevitably—she went to work writing advertising copy for magazines and radio, and where she would later develop and write “singing commercials” for radio broadcasts, an advertising specialty she excelled at. In that sense, she was perfectly prepared for the strange mix of style and substance required in those earliest days of television news.
But Meet the Press always pushed back from the desk of style and slickness. And that tendency to repudiate canned, pre-prepared, scripted news is what gave the show its voice. Meet the Press was not even a press conference, per se, but rather an opportunity for one or two or perhaps four reporters to dig in on an issue—or a set of issues—with their guest. And this meant not letting the guest off-the-hook, as it were.
This is why Tim Russert fit the show so well for so many years (Russert held the post of moderator longer than any other host, from 1991 until his death in 2008; Ned Brooks held the job the second-longest, from 1953 to 1965).
Russert didn’t look or act or feel like a television reporter. He was often rumpled and on the verge of being disheveled, hunkered down in a stance that leaned in toward his guests. His delivery was neither smooth, nor polished, nor alliteratively glossy (“a lot of light alliteration from anxious anchormen placed in powerful posts…”), nor did he ever seem overly impressed with his guests. He hovered at the very edge of irreverence, all the while being cordial, polite, and smiling. Some said he reminded them of an impish, irreverent college professor—the kind you might have a beer with after class. He was also unsparing and blunt, but never sarcastic, something his Irish Catholic upbringing (perhaps) had taught him was possible while still being impeccably well-mannered and jovial. Because he exuded a kind of comfortable everyman charm and a bit of street savvy, Russert’s guests understood ahead of time they were required to answer candidly, or face a second hit—this time harder.
Gregory, for all his likeability and skills, did not fit that particular suit. Where Russert was direct but engaging, at times even blunt, the strategically adept Gregory was perhaps, at times, conciliatory to the point of deference. But at other times, Gregory was caught up in showmanship (as on the occasion he brought a gun magazine for an assault rifle onto the set with him as a show-and-tell piece for an interview with an NRA spokesman). And Gregory’s impeccable delivery, pacing, and diction meant that he bore a closer resemblance to Brian Williams, or to David Muir—who was recently promoted to the job of anchor at ABC World News after the departure of Diane Sawyer.
Is this where Chuck Todd makes an ideal compromise? Todd, like Russert, disdains the showmanship aspect of the process (though he does make a halfhearted attempt to conceal an obviously receding hairline by combing his thin dark hair forward). But Todd, like Russert, is otherwise suspicious of the kind of slickness personified by Gregory. Todd is also a bit of an everyman—from that now ubiquitous goatee, which means he could easily be mistaken for your air conditioning repairman or the guy who drives the boat on your next fishing trip—to the language of a questioner devoid of sugar-coating and equivocation. When President Obama seemed to hint that the U.S. would have to forge some kind of partnership with Syrian ground forces in order to fight ISIS, Todd winced and interjected an incredulous “who?” Todd (like Russert) is not afraid of contesting the remarks of powerful people. And Todd (like Russert) is not concerned, at least at the moment, with pleasing powerful people, and this point is perhaps the most important; Gregory, for all his numerous skills and talents, often seemed to be trying to win approval of his guests.
Though it is not clear what will happen to that glossy, colorful set—which is a far cry from the primitive-looking stage sets of the Meet the Press of past decades—Todd may also be in favor of downsizing the aughts-era set and its grand collection of books. (Disclosure: I happen to love that part of the current set, and some Sundays I expend measurable energy trying to read those numerous titles.) Some parts of the set were apparently already in a state of transition this past Sunday, and Todd likened it to living in a house while it is being remodeled. The panel of journalists and commentators helps to return the show to its roots as a group process, and in that sense the glittering set and the impeccable lighting should be secondary anyway.
When the iconic show began to look like a long-form version of short form news, it began a slow process of decline. Meet the Press must again prove its relevance (and not just in the ratings battles) by inching away from the network news model.
Related Thursday Review articles: Debating America Each Week; book review, Eclipse of Equality: Arguing America on Meet the Press, Solon Simmons; review by R. Alan Clanton; Thursday Review. http://www.thursdayreview.com/MeetThePress.html
See more at: http://www.thursdayreview.com/ChuckToddMeetThePress.html
Book review by R. Alan Clanton, Thursday Review editor
Facebook recently celebrated its tenth birthday. The multibillion dollar company, founded in 2004 by Mark Zuckerberg, has grown to be one of the most valuable corporations in the world, and its sole product is information and data.
There is no drilling for oil, no laying of pipelines, no ships upon the sea, and no mining of precious metals. There are no bottled or canned drinks, no assembly lines making cars and trucks, no factory churning out toaster ovens, shower curtains or computer components. Just data—your data, the data of your several hundred closest “friends,” along with the data of roughly 1.3 billion other people around the globe who use Facebook. And unlike other multi-billion dollar industries, from Coca-Cola to Microsoft, from Taco Bell to Koch Industries, Facebook spends almost no money advertising its service.
Further, Facebook has no rivals, at least not in the traditional sense. Coca-Cola competes with PepsiCo, Wal-Mart competes with Target, NBC News competes with ABC News. Facebook’s last real competitor, MySpace, faded into relative obscurity more than five years ago. There are others out there, like Tumblr, Google+ and LinkedIn, but Facebook’s predominance over its quasi-competitors makes any comparison lopsided in the extreme. For the vast majority of computer users and smartphone users, the ubiquitous Facebook is a tool as important as one’s wristwatch or one’s credit card. For some, it may be more important.
But is Facebook a game-changer in the long history of human interaction and communication?
A new book by Tom Standage, Writing on the Wall: Social Media, the First 2000 Years, argues that Facebook is merely one in a long series of human inventions designed to make the spread of news and the dissemination of information easier and more reliable. Facebook may be more user-friendly and more democratic in its power to engage, but it is a logical—indeed inevitable—merger of technology with the human need to inform and be informed.
Highly readable and instantly engaging, Standage’s book starts with an explanation of the ancient and entirely human impulse to share news and information and to tell stories about the human condition. At the core of social media is the more primitive concept of the social pack or societal unit, which performed a useful and, as it turns out, essential service for its members: food, shelter, protection, family equilibrium, grooming. Facebook, in which the average user has roughly 130 friends, replicates with eerie precision the social networks of humans even thousands of years ago, when the average hunter-gatherer clan would top out at about 145 to 150 people. This is known as the Dunbar number, and it indicates the largest size of any community in which everyone could know with some intimacy everyone else in the clan, for above this number some people would be strangers to one another. Over time, physical grooming was replaced with social grooming, in the form of news, gossip, storytelling, and social interactions designed to vet and filter information.
For this reason, Standage argues that the human brain is hardwired for social networking, with tens of thousands of years of fine-tuning all forms of direct and indirect communication. From cave drawings to stone tablets, from early hieroglyphics to the first systematic written languages, humans have sought to find the most useful ways to pass along critical information, as well as to develop tools to filter information for reliability.
Filtering and vetting information becomes of great importance as human history progresses and languages become more complex. And the reliability of news and data becomes critically important along the way as well, as humans must learn to sort out disinformation from truth, officialdom’s propaganda from balanced reporting and objective evidence. Think of Russia’s seemingly absurd campaigns of disinformation regarding the crash of Malaysia Airlines Flight MH17 over eastern Ukraine; or, likewise, its recent incursions into Ukraine despite months of telling the world that vast military movements near the border were simply Army exercises.
Standage traces the lineage of mass communication and interpersonal dispatches from the time of the Greeks and the Romans through the invention of the printing press. The ancient Greeks invented and perfected outdoor graffiti as a form of interpersonal communication—writing on walls and creating newsfeeds—two and a half millennia before Facebook. Cicero used papyrus documents to present news and reviews, then asked those who came in contact with the information to add their own commentaries and interpretations. Among Julius Caesar’s various contributions to social media: the development and founding of a prototype newspaper—hand-written, but copied by involved citizens and urged upon those traveling within the Roman lands. Today’s iPads, Kindle readers, Nooks and other devices—dazzling though their abilities are—nevertheless bear a striking resemblance to early clay and wax tablets, which were carried by hand or in bags.
Social media—as we understand it—is not new. It represents merely a thread of human interaction embedded deeply in our desire to understand our world, our community, and to connect to those closest to us. What has so radically altered the template has been technology, a tool which has allowed billions of people worldwide to connect using universal tools on computers and smartphones. What was once information spread and disseminated by hand, face-to-face, or in small groups—much the way the word of early Christianity was spread to hundreds, then thousands, then millions, starting with only a few dozen people—can now be sent to thousands within seconds. When Martin Luther sought to repudiate what he saw as a sclerotic, even corrupt officialdom in the church, he used nothing more elaborate than a list posted on a door—which in turn was copied, then copied again, by hand, in what amounted to a declaration gone viral.
Politics has often played a part in social media. The pamphlet and the handbill were early forms of proselytizing political views and societal struggles. Printed handouts were sometimes decisive in the cultural and political changes which swept France, Russia, Great Britain and the United States—thus literacy moved hand-in-hand with political awareness and social advancement. Centuries before Starbucks—with its Wi-Fi and smartphone charging stations—coffee houses were used as a place to hold forth, compare ideas and ideologies, challenge conventions, and foment revolutionary ideas. Like the internet, Facebook, and Twitter, coffee houses were accused of breaking down social skills and encouraging an institutionalized form of wasted time.
In short, are our social media platforms—Facebook, Google+, LinkedIn, Twitter, Pinterest—so radically different from the way humans have engaged for thousands of years? Or are they simply the logical merger of digital technology with the human need to connect and share?
Taken as a whole, Standage’s book is highly readable and moves very smoothly. Its only fault—minor, to be sure—is that some chapters belabor a point well after it has been made quite effectively. Still, it’s easy to overlook this indulgence since he tells the story of social media so well and with such striking comparisons. A fast, fluid, addictive read; and more relevant than a dozen other books on the great technological and business disruptors of our day.
Related Thursday Review articles: Beware the Siren Servers; book review of Who Owns the Future, Jaron Lanier; review by Alan Clanton; Thursday Review. http://www.thursdayreview.com/WhoOwnsFuture.html
By R. Alan Clanton, Thursday Review editor
According to almost everyone who knew him, the American photojournalist James Foley was one of those people that you instantly liked. He wouldn’t know how to make an enemy if he tried. Nevertheless, Foley was regarded as an enemy of ISIS, or, perhaps more troubling—as a pawn to be used as leverage.
Unblinking but empathetic toward the subjects he photographed—and an honest broker among the people he covered when gathering news—Foley met a horrifying death at the hands of ISIS extremists, reason enough to fear not merely the message being spread by the militants but also their twisted world view.
Foley was an unlikely enemy of ISIS in that sense—neither a spy nor someone easily misunderstood as a spy. But Foley had the misfortune of being captured at gunpoint along the border between Syria and Turkey two years ago. He had been freelancing for several media outlets, including The Global Post (Boston) and Agence France-Presse. Witnesses say that he was abducted by militants who took him from one car and tossed him into another.
Foley was never seen or heard from again until his image appeared in a video, shot and edited by ISIS fighters, which appears to show the photographer being beheaded by an ISIS militant.
Over the last 20 months, his parents had pleaded with the militants (it was not entirely clear who had abducted him in November 2012) to show mercy and release him. Some analysts suggested that Foley would eventually be used in a prisoner trade with extremist groups—Foley in exchange for captured al Qaeda or ISIS militants.
But the dynamics have shifted wildly since the spring, when the de-evolution of conditions in northern Syria gave rise to a more extreme version of anti-Assad militancy. Lawlessness and chaos enabled ISIS to gather momentum, and in June the militant army sprang across northern Syria and into Iraq. Ahead of its advance, Iraqi soldiers retreated, in many cases abandoning their weapons and vehicles, and in the process allowing ISIS to become even more heavily armed. ISIS has since spread fear and terror across a wide swath of the Middle East, sweeping into towns and cities, forcing the immediate conversion to strict Islamic law (as it is interpreted by ISIS), and summarily executing anyone who did not comply. Beheading became the punishment of choice in some cases, though ISIS has also released many videos which show people being executed at gunpoint, their bodies pushed into hastily dug ditches and mass graves.
Yezidis, an ethnic and religious minority concentrated heavily in the area around Mount Sinjar, in northwestern Iraq near the border with Syria, became the target of ISIS’s most recent assaults. Yezidis in a dozen towns and villages were forced to evacuate as ISIS fighters approached, and tens of thousands of civilians fled into the hills and onto Mount Sinjar. Many told horror stories of watching as men were shot, women were raped or tortured, or small groups of Yezidis were executed by gunfire or beheadings.
As a humanitarian crisis unfolded for the thousands trapped on Mount Sinjar, U.S. President Barack Obama ordered targeted air strikes on ISIS positions. More than 100 strikes have taken place since the air campaign began nine days ago, and a combination of Iraqi forces and Kurdish fighters have begun to make a considerable pushback against ISIS positions on the ground. Just days ago, Kurdish and Iraqi troops waged a hard-fought battle to regain control of the dam at Mosul, and other ground detachments have been moving to retake other key locations, including oil fields and oil refineries captured by ISIS last month.
The U.S. air strikes have dealt a serious blow to ISIS, disrupting their movements, destroying vehicles and weapons stockpiles, and killing more than 30 militants. The killing of James Foley was intended as a very public form of revenge against the U.S. for its policy of attacking ISIS. In the video of Foley’s death, an ISIS militant threatens more beheadings of captured Americans if the U.S. military continues its campaign against ISIS forces on the ground. Now believed to be at grave risk is American journalist Steven Sotloff, who was freelancing for Time and The National Interest when he was kidnapped one year ago.
Foley, who turned 40 this year, was widely liked by other journalists, videographers and photographers. He was admired not only for his self-deprecating humor and his skills as a photojournalist, but also for his sharing, giving nature and his empathetic approach to his subjects. Foley had been in Syria covering that country’s brutal civil war when he was abducted. Foley often said that he felt it was his calling to be a front-line journalist—that is, a reporter and a videographer willing to put his safety and life at risk if that was what it took to bring the human story of war to a world audience.
Foley had been captured by insurgents before, in Libya in 2011, and was held for 44 days. Despite the experience, Foley insisted on continuing to report from troubled places, including Syria’s brutal civil war.
“I still want to be a conflicts journalist,” he told the Boston Globe after his release in Libya, “but I realize this is life and death.” When Foley was abducted in Libya, he was one of several journalists who witnessed South African photographer Anton Hammerl being shot dead right in front of them.
Foley was working in Syria when he was captured in November 2012. Secretly, the U.S. military had attempted a rescue mission earlier this year to free Foley and other journalists, but the top-secret mission apparently failed because the hostages were not in the location which Pentagon intelligence had led them to regard as the key target.
A video released by ISIS a few days ago shows Foley dressed in an orange jumpsuit, his hands apparently tied behind his back as he kneels in a featureless desert landscape. Next to him is an ISIS militant dressed entirely in black, his face hooded. After a few minutes in which Foley is allowed to speak, the militant issues threats—interestingly, in a British accent tinged with a hint of either Liverpool or Scotland. The militant then produces a large knife and begins to cut Foley’s throat, though the video fades quickly to black.
The U.S. National Security Council verified the video, and in a statement said that “we are appalled by the brutal murder of an innocent American journalist, and we express our deepest condolences to his family and friends.”
The Committee to Protect Journalists has condemned Foley’s beheading, calling ISIS’s actions barbaric.
Foley’s killing may have also had the effect of drawing other nations into the fray against ISIS. So outrageous were the circumstances of his death that Italy and Germany announced their intention to begin supplying weapons to Kurdish fighters (and other minority groups within Iraq and Syria) to aid in their battle with ISIS.
President Obama angrily compared ISIS to a cancer, and promised that the U.S. air campaign would continue unabated despite ISIS’s threat of more beheadings and killings. More air strikes were conducted by U.S. forces today in the area near the Mosul dam and in areas south and southeast of Mount Sinjar.
Gary Pruitt, president and CEO of the Associated Press, called the murder of a journalist during wartime an international crime.
Related Thursday Review articles:
Iraq Airstrikes Continue; Maliki Steps Aside; R. Alan Clanton; Thursday Review; August 15, 2014.
By Earl Perkins, Thursday Review features editor
Without libraries, opposable thumbs may be one of the few things separating us from supposed lower life forms. So when city and state governments nationwide cut budgets, libraries and media centers (and, by extension, the education of young people) are often among the first targets, and that should concern us. If the world faces serious problems, don’t you think it would be nice to educate the next generation so they at least might have hope for a better life?
“Library funding behavior is driven by attitudes and beliefs, not by demographics,” according to a report by the Online Computer Library Center (OCLC; formerly known as the Ohio College Library Center). “Voters’ perceptions of the role the library plays in their lives and in their communities are more important determinants of their willingness to increase funding than their age, gender, race, political affiliation, life stage or income level. The more that can be learned about library perceptions, the better the chances of constructing a successful library support campaign to improve library funding.”
Libraries continue to meet society’s changing needs, despite dealing with recession-driven financial constrictions and federal neglect, according to the National Center for Education Statistics. Numerous school libraries face elimination or de-professionalization of programs, the study stated. Most libraries struggle simply to remain relevant and keep their doors open, even though a recent independent national survey found that 90 percent of respondents said libraries are important to communities, according to the American Library Association.
These and numerous other library trends studied throughout the past year were discussed in the ALA’s 2014 State of America’s Libraries report, which was released during National Library Week this past April.
The majority of federal library program funds are distributed through the Institute of Museum and Library Services to each state. The Library Services and Technology Act (LSTA) is part of the annual Labor, Health and Human Services and Education Appropriations bill. The ALA’s Washington office spearheads numerous activities, which include lobbying for LSTA funds, communicating with Congress concerning funding for federal libraries, the Library of Congress, the National Agricultural Library and the National Library of Medicine, among others.
ALA representatives seek funding for the National Endowment for the Arts and the National Endowment for the Humanities, along with backing the causes of adult education and literacy. They also push for libraries to become involved in education programs, including those for early childhood education.
The majority of library funds come from state and local sources, but federal funding provides critical assistance, giving libraries nationwide the financial support needed to serve communities. The amount of funding a library receives is directly proportional to the quality of the services it can offer.
Federal support for libraries has been falling the past several years, capped by severe funding cuts to LSTA and many other vital programs, even forcing some facilities to close. The ALA closely monitors numerous programs, because libraries are interrelated with education, the humanities, the arts and other social functions.
The U.S. Senate Appropriations Subcommittee on Labor, Health and Human Services, and Education and related agencies (Labor-HHS-Education) marked up its FY 2015 spending bill on June 10. The bill includes several important programs, including LSTA and Innovative Approaches to Literacy (IAL). It funds LSTA at $180.9 million, which matches this year’s total. IAL is level-funded within the bill’s report language, thus funding the program at $25 million for FY 2015. At least half the money appropriated to IAL must be set aside for competitive grants for low-income school libraries.
The bill provides $156,773,000,000 in base discretionary budget authority, matching the FY 2014 level, while including $1,484,000,000 in cap adjustment funding, a $560,000,000 increase to prevent waste, fraud, abuse and improper payments in the Medicare, Medicaid and Social Security programs.
U.S. Senator Tom Harkin (D-Iowa), chairman of the Labor-HHS-Education Subcommittee, had high hopes that the legislation would improve education in America.
“This is the bill that invests in America and allows us to respond to the changing needs of our country, all within a difficult budget,” he said. “I am particularly encouraged that the bill directs funding to investments in high-quality early childhood care and education, which have been proven to have positive, lasting effects on children and families.
“The bill also invests in programs that support working families and contains funding that allows for an increase in the maximum Pell Grant award,” Harkin said, “which is critical for expanding access to higher education. All in all, this bill takes a thoughtful approach to funding these critical programs because this bill funds America’s priorities; it is the bill in which we invest in our future.”
Related Thursday Review articles: How a College Library Thrives in a Digital Age; R. Alan Clanton; Thursday Review; April 18, 2014.
New Yorkers Challenge Plan to Gut Landmark Library; Thursday Review; May 14, 2014.
Editor’s Note: This is the third in a series of articles about libraries in the digital age; future segments will include looks inside public libraries, large and small; as well as other media center venues.
By R. Alan Clanton
Thursday Review editor
(Originally published July 11, 2014) Back in February of this year, when Brian Roberts, CEO of Comcast, announced Comcast’s proposed buyout of Time Warner, there was a storm of media coverage about what the merger of these two cable and internet giants would mean. The majority of the press narrative surrounded customer service: would such a mega merger be good for customers in the long run?
The overwhelming response by those who joined in the discussions regarding the massive merger: no. Most feared longer on-hold waits when calling for customer service or technical support, longer delays in the field during outages or service problems, higher rates and new fees, and—in general—abysmal service from two companies with already poor customer service rankings.
Some economists and business analysts looked at the equation from the standpoint of jobs: layoffs will surely follow in the wake of such a massive merger; more call center and tech support operations will migrate overseas; thousands of blue collar, white collar and office support people would be dumped into unemployment. Cynics said nothing much would change at all.
But some realists, and a few optimists, saw merely an inevitably shifting market with new opportunities: Other technologies would continue to spring to life, especially as younger consumers sought content in unorthodox and unconventional ways. Technical innovations would arrive—as they seem to arrive daily—giving us alternative tools to TV content and entertainment choices. Satellite companies would likely gain customers, and their increased revenue would give them more leverage to expand and update their own technological strengths.
But, at about the time that DirecTV was seeing a surge in new customers—mostly those engaged in a pre-emptive bailout from cable in Time Warner and Comcast areas—AT&T announced its proposal to purchase the satellite giant outright. The marriage of AT&T and DirecTV was, in fact, a logical response to recent Comcast acquisitions. Not wanting to be left standing without a chair when the music stops, other media and telecom giants are looking to shore up flanks and find new ways to remain relevant in an age in which technological changes create paradigm shifts almost overnight. Everyone is affected: Verizon, Apple, Amazon, T-Mobile, Sprint, Charter, Cox.
And remember Aereo? Its tiny dime-sized device—basically an antenna for receiving and storing TV signals—threatened to unravel the business model of both broadcasters and cable television. Aereo’s little device was so disruptive to CBS, Fox, ABC and NBC, that the contentious matter ended up in the Supreme Court, where justices recently agreed with broadcasters and declared Aereo’s antenna a tool for theft.
Aereo lost its case, but you can bet that there will be more digital disruptors and existential challenges very soon, especially as Facebook, Amazon, Apple and Google seek to grab more hours of our collective attentions. It was for this very reason that Comcast justified its need to merge with Time Warner, and those same imperatives apply as AT&T makes its case to Congress for the right to acquire DirecTV. The mergers will continue, and customer relations may suffer.
But lost in the hue and cry over customer service has been what some technologists regard as the more transcendent issue in these massive mergers: net neutrality.
Net neutrality, for those unclear on its passive, almost oblique language, is the basic philosophy that says that all internet traffic should be treated equally. Net neutrality is to internet content what Lady Justice is to the law: a formidable, stoic presence ensuring evenhandedness and fairness, and blindfolded to issues of creed, color, convention and context. Net neutrality means that your cable or internet provider will treat equally all traffic coming into your home: streaming movies, online games, access to banking or retail activity, music downloads, payment activities, photo uploads or downloads, television content, music videos and other short video material.
Advocates of a free and open internet regard its neutrality as crucial—a fundamental tenet essential to the free flow of information and the growth of innovation. And there are other comparisons employed. Just as your landline phone is neutral, able to make or receive calls in an unfettered landscape (unless you choose to block certain callers), so too should your internet access be unhampered. Just as your electricity flows into your home on equal footing with your neighbor’s, so too should your web access.
Originally a cherished and critical element in the thinking of the FCC and other Federal agencies, net neutrality has seen slippage over the last decade. In an important change of direction, the FCC in 2002 declared that the internet was more akin to an information service than a public trust. In that sense, according to then-chairman Michael Powell, your internet service provider (ISP) was more like a magazine or newspaper: you can subscribe to it—for a price—but there are no guarantees about its content, which comes at the discretion of owners, publishers, and editors.
That change of philosophy set in motion a slow but inevitable shift in the winds. But the tide has ebbed and flowed well into the late aught years. In 2010, largely as a result of complaints that Comcast was interfering with its customers’ web access and internet preferences, the FCC took a step back toward a policy of neutrality by reaffirming its view that the internet should be treated in the same way that government treated phone lines in the early part of the twentieth century: the web is a public trust, and its architecture serves as a “common carrier.” The FCC’s 2010 position was imperfect, and many technology advocates complained of the loopholes, but it was an important step in the direction favored by those who want to see the freest flow of information and content.
But early this year, a U.S. Appeals Court ruled against the FCC and in favor of Verizon; the case had centered upon Verizon’s arrangements wherein the mobile phone giant was charging additional fees to companies like Amazon, insuring those companies premium access (meaning faster speeds), and relegating other applications and services to a slower lane.
The issue has now become a political challenge facing the administration of Barack Obama—and probably the administrations of the next U.S. presidents. Back in 2008, Obama had campaigned on a promise that he would honor net neutrality, but in practice he has done little to re-establish that central canon. Recent telephony and cable television mega mergers have made it abundantly clear that the time is ripe to establish a core value system for the internet’s rules-of-the-road.
That Appeals Court decision in the Verizon case opened the door to more preferential treatment, and by extension it created an avenue for more revenue for ISPs offering premium access. Comcast and Netflix recently came to an undisclosed arrangement whereby, for a fee, Netflix can stream its content across Comcast’s vast infrastructure without inhibitions on speed or quality. Likewise, AT&T enabled iTunes to have a special “fast lane” across its massive architecture, but only after Apple negotiated for the use of that specially tweaked speed and access. Meanwhile, services similar to iTunes, like Spotify, are left struggling with inferior access simply because they are unwilling or unable to negotiate preferential treatment, ensuring that the fastest lanes go only to those who pay.
Advocates of an open and neutral internet agree that what suffers most from a tiered arrangement is technical innovation. Smaller companies, unable to pay the premium fees charged by a large ISP like Comcast or AT&T, would face serious challenges to the development of their products and services, and some of those hurdles would be insurmountable without the same access to the internet as other companies. Worse, some fear that a pay-for-speed web would begin an organic process favoring almost entirely the biggest players—those with deep-pockets and little to fear from negotiating privately with big ISPs. Smaller players would be forced from the table quickly, and even those who survive would almost certainly be forced into shotgun marriages with other, larger firms—a cycle of larger companies buying out the smaller ones.
There is also the specific concept of “the last mile” of delivery. Without some form of regulation to maintain an open, unfettered internet, big ISPs become gatekeepers and key-masters in one stroke. Innovative web companies, tech start-ups, hardware makers, software designers, and content creators may have great products and services, but if a big carrier can establish a rate structure for content based on speed and reliability, the benefits and value of these products are lost.
Consumer advocates worry that tiered-price arrangements for content providers will increasingly translate into equally complex pricing and fees for subscribers and web users. In the near future, when Comcast completes its merger with Time Warner, at least one third of the U.S. will be inside the footprint of the Philadelphia-based cable company. That means that customers may experience the full effect of an internet that is anything but neutral. And some business analysts worry that the effects of a stratified U.S. internet could easily spill over into similar behavior in markets around the world. And it may also stifle technology in the U.S. while driving it into foreign markets.
The January court decision was not a total setback, however. Verizon had gone so far as to argue that the FCC had no authority to regulate broadband or wireless access, a position the court rejected totally. But the court basically ruled that the FCC’s 2010 working-position was akin to an overreach. The FCC, according to U.S. Circuit Judge David Tatel, should have more carefully linked its requirements regarding net neutrality to the concept of common carriers. The problem sprang from the FCC’s own bipolar thinking, and dates back to Powell’s 2002 interpretations of ISPs as information services. That 2002 working-position was a legal time bomb, and when in 2010 the FCC attempted to reassert net neutrality, the fuse started burning.
But experts say that now is the time to rewrite and retool the guidelines to better encompass the rapid technologies now unfolding around us.
Some Thursday Review readers have commented in the past about articles we have posted on this topic, especially in the context of recent mergers. One reader, someone familiar with the cable business from the inside, said that the original thinking on internet access was right all along—it’s just that we didn’t collectively see the metaphor as flexible.
“In the 1990’s we called it the Information Superhighway,” he said, “and advocates of a free internet understood that to mean just what that image elicited—lots of cars and trucks moving along at a moderate-to-fast clip. But what was not seen was an age in which there were special lanes for carpooling, lanes for buses, lanes for electric cars, bike lanes, wheelchair lanes, you name it. And there are toll-booths. Some people have speedpay, others pay by the month, some by the year.”
In other words, to paraphrase George Orwell, some web access is more neutral than others.
Verizon, Comcast, Time Warner, AT&T and other ISPs also make the case—and it is not unreasonable—that it is, after all, their capital and their labor which builds the highway. Should they not, under such circumstances, be able to charge more for use of the fastest lanes? And if Amazon and Netflix are willing to pay more for the fastest streaming available to Comcast subscribers, would not that additional revenue stream enable Comcast to reinvest in bigger, better, faster highways for everyone? Maybe.
But not so fast, some say. The January 2014 decision was handed down by a three-judge panel, and even then there was dissent. Judge Laurence Silberman, writing a minority opinion, said that new innovations are always at work, sometimes undermining conventional business models.
“This regulation [of internet speeds and reliability],” Silberman wrote, “essentially provides an economic preference to a politically powerful constituency, a constituency that, as is true of typical rent seekers, wishes protection against market forces.”
In the meantime, a billion or more people are now connected to the internet via broadband, telephone infrastructure, or handheld devices, and that number grows by millions worldwide every day. Even giants like Comcast, AT&T, Verizon, Deutsche Telekom, T-Mobile and others cannot keep up with that growth. Thus companies like Google, Facebook and Amazon are giving serious consideration to space hardware and drone technology to bring internet access to still more millions.
In the end, innovation suffers most. Without net neutrality, the web’s core value gets inverted. The internet has enabled tens of thousands of business start-ups to blossom, often with little or no investment of cash, and this influx of dazzling competitiveness has radically realigned the marketplace and reshaped the economy. The power of a neutral highway means that thousands of little companies have the same opportunities as the dozens of big companies, and this can sometimes have transformative effects (just look at Blockbuster, Borders and most major daily newspapers). That level playing field ought to be honored as central to a dynamic economy and a capitalist framework.
Thursday Review will have more on this important topic in the near future, and we invite our readers to send us their opinions and comments on this issue.
See more at: http://www.thursdayreview.com/NetNeutral.html
By R. Alan Clanton | published June 18, 2014 | Thursday Review editor
County commissioners in Arlington County (Virginia) recently approved a proposal by a company called Monday Properties to demolish the multi-level parking structure once used by Washington Post reporter Bob Woodward and his secret contact for Watergate investigation information. Back then, Americans knew Woodward’s source only as Deep Throat, a moniker coined by Managing Editor Howard Simons to describe Woodward’s extremely shy contact as “being on deep background.” Decades later, a former top FBI agent named Mark Felt would reveal himself to have been the secret source.
The demolition of that parking garage will take place sometime next year, and its destruction will make way for a high-rise apartment building and a small retail complex. There is already a commemorative marker on the street near the garage, explaining the historical significance of the site, but the county and the developers agreed to enlarge the signage, possibly adding additional materials and even photographic displays. The passing of that garage was, perhaps, to be expected. Some buildings of historic significance survive, some do not. A parking garage hardly compares to Monticello, or the Custis-Lee Mansion.

In the meantime, a more important milestone has been reached in the long shadow of Watergate. Forty years ago, newspapers and book publishers printed the first editions—almost all of them in paperback—of the Presidential Transcripts. Both the New York Times and the Washington Post partnered with publishers (the Times with Bantam, the Post with Dell) to make available, for about $2.50, a massive paperback with the complete text of the tapes released by Richard Nixon and his staff in 1974. The subtitle of the book was in fact the official title of the materials handed over by Nixon to Congress: Submission of Recorded Presidential Conversations to the Committee on the Judiciary of the House of Representatives by President Richard M. Nixon. A more cumbersome title than simply “The White House Transcripts,” as those pages became popularly known. By the time the books were published, Watergate had become a national obsession, with hundreds of reporters working full time on the topic, and the majority of nightly broadcast news devoted to the investigation.
The whole thing had started on the night of June 17, 1972 (42 years ago, yesterday), when five men dressed in upscale suits broke into the offices of the Democratic National Committee, which at the time was in the Watergate, an upscale multi-use complex of offices, apartments and a hotel in Washington, D.C. The burglars were caught with electronic bugging equipment, thin envelopes of cash (the $100 bills were in sequence), and a couple of small notebooks containing the phone numbers of people at the White House. Woodward was sent to their arraignment, and the investigation began in earnest, with Woodward working alongside veteran reporter Carl Bernstein. In addition to the Washington metro police, the FBI began investigating. Working on a parallel trajectory, Woodward and Bernstein did their own gumshoe work even as the FBI moved slowly with its massive inquiry. By early 1973 the case had found its way to the attention of Congress, and into several courtrooms. Hearings were launched on Capitol Hill, with all the predictable grandstanding by politicians, courtroom theatrics by attorneys and counsel, evasions and obfuscations by witnesses. One of those called to testify was Alexander Butterfield, a mid-level staffer and security expert for Nixon. A day or two after rumors began circulating that some of Nixon’s conversations might have been tape-recorded, Butterfield was asked directly by Senate Counsel Fred Dalton Thompson if there was a taping system. Butterfield not only acknowledged the taping, but said it had been expanded to include most offices in the Executive Office Building (EOB), and that the system was voice-activated.

Contrary to the widespread mythology of Watergate, Nixon did not install the taping system. The first tape recorders were installed during the last year of the administration of Franklin Roosevelt. The recording systems were expanded and upgraded by subsequent presidents, from Truman to Eisenhower to Kennedy.
Lyndon Johnson further expanded the taping system, upgrading it even more, and it was during Johnson’s tenure that tape recording systems were added to key phone lines as well, including in the Oval Office, the Lincoln Sitting Room, and several smaller offices. The recording systems were eventually centralized and housed in a small utility room on a lower level of the White House, where they were maintained and checked daily by the Secret Service. Several agents rotated the seemingly mundane duty of swapping out full recordings with blank tapes, and making sure that all the machines were operating properly. Tapes were labeled by hand, and placed into cardboard boxes or on shelves. Shortly after Nixon became President, newer Sony and Uher recorders were installed to replace the older gear, and by some reports as many as ten recording mechanisms—using scores of small lavalier mics in eight different rooms—were in operation by 1972. In addition, Nixon asked that the phone recording systems be expanded and upgraded as well. The President’s secretary, Rose Mary Woods, had an expensive, top-of-the-line Uher 5000 machine near her desk, available for playback and transcription. The essential fact of Nixon’s role in the taping system was his suggestion, probably in 1971, that all taping mechanisms activate automatically upon the start of conversation in any of the monitored offices. No longer would someone have to manually—and deliberately—press a button to begin recording what was being said.

Soon after Butterfield revealed the existence of the previously secret taping system, members of both the House and the Senate were determined to gain access. If Congress and the courts could find evidence that Nixon had been ordering his loyal lieutenants to interfere with the investigations or to destroy evidence, then Nixon could be charged with obstruction of justice.
Seeking to verify some of what John Dean had told investigators, the special prosecutor, then Archibald Cox, wanted access to specific tapes. The White House stonewalled. Cox asked District Court Judge John Sirica to send subpoenas to the White House requesting eight of those tapes. Nixon still refused to release any tapes, citing executive privilege. His reasoning had some legal basis, and though it was apparent he was simply dodging having to take direct responsibility for any wrongdoing, Nixon and his lawyers argued that the separation of powers meant that Congress had no blanket authority to reach into the president’s personal conversations, many of which might involve matters of national security or international relations. But the Democratic-controlled House and Senate were having none of that, and Sirica confirmed that he too would not budge—the White House would have to relinquish those eight tapes. Nixon offered what became known as the Stennis Compromise, whereby Nixon would loan the tapes to U.S. Senator John Stennis. Stennis could, acting separately and independently, produce his own executive summary and analysis of the tapes to confirm or refute accusations that the President had acted illegally. Cox refused to agree to this arrangement, as did Judge Sirica. Angry that Cox would not budge, Nixon asked Attorney General Elliot Richardson late that night to fire Cox. Richardson refused, and instead resigned on the spot. Within the hour, Richardson’s second-in-command, William Ruckelshaus, also resigned when the task fell to him to fire Cox. Next in the line of succession came Robert Bork, then Solicitor General. Bork acquiesced to Nixon’s demand and fired Archibald Cox. The event became known as the Saturday Night Massacre.
Eventually, after various legal tactics collapsed and several delaying maneuvers failed, and under extreme political pressure, Nixon and his team released heavily edited transcripts of more than one hundred conversations. Nixon never conceded that Congress had the right to listen to the tapes, but he was willing to produce dozens of ring-binder notebooks filled with 1,200 pages of edited versions of those office encounters. The transcripts were rushed into print, and bookstores from San Diego to Seattle, from Manchester to Miami, built Volkswagen-sized displays of the paperback versions. For a brief moment in 1974, those thick paperbacks became the biggest-selling non-hardback book on the best seller lists. Nixon was embarrassed for the nation to see the kind of language and tone being used in the Oval Office, and he ordered his staff to expurgate the dozen or more offensive words so liberally peppered throughout the transcripts. Also excised were the racial epithets and religious slurs, only one part of the pattern of talk often employed by Nixon and his closest aides, Bob Haldeman, John Erlichman, John Dean, John Mitchell, Charles Colson and others. Notable writers and columnists decried the lowball tone inside Nixon’s inner circle—small talk, cheap talk, backbiting gossip, coarse language. To those entering his office, Nixon would often offer disparaging remarks or insults about the person who had just left the room. In place of the ship’s boiler-room language and the crass insults and the ethnic slurs, phrases like “expletive deleted” or “characterization deleted” were employed. Instead of providing delicacy, the self-censorship only underscored the absence of intellectual debate and broad-minded governance—the leaders of the free world cursing and back-stabbing. But the tapes also generated a strange ambiguity.
In the press and among his many adversaries in a Democratic-controlled Congress, a naïve narrative had taken hold that the tapes would produce unambiguous gotcha moments—instances where those in the room with Nixon agreed out loud and in clear language that they intended to commit a crime. But Nixon’s obsession with driving the conversations, along with his well-known tendency to think out loud—bouncing ideas, testing reactions, sizing up outcomes, vetting and venting, even pulling back from some of the more absurd discussions—meant that for every occasion where Nixon seemed to be crossing the threshold into illegality, there was an instance moments later when he withdrew, demurred, or changed his mind. The transcripts became 1,200 pages of Rorschach testing. Nixon’s closest allies in Washington, while appalled at the lowball comedy of what was on those tapes, could also find a man in an innocent state of mind even as he manipulated and maneuvered. One could read into some of the passages whatever one expected to find, and a nation already deeply divided by the scandal grew more divided.

Those big-selling paperback books also made for strange reading. The men around Nixon used a kind of shorthand or coded language, at times filled with a mishmash of Army and Navy lingo, advertising jargon, football vernacular, and political Pig Latin, all of which required contextual translation. There was “go the hang-out route” and “get it in the neck” and “deep six” and “beating the rap.” There was “over the hill” and “twisting in the wind.” There was “the long ball” and there were “Hail Mary passes.” Accompanying the “expletives” were thousands of “inaudible” and “unintelligible” insertions. Mashed together with such frequency, some passages become theater of the absurd, like reading Waiting for Godot backwards and with every fifth word deleted.
The books became the butt of ten thousand jokes, the grist for hundreds of political cartoons, and the source of endless hours of reading for the millions who bought the book. It became history’s most unlikely political best seller, and a strange, jarring look inside the White House and into the siege-mentality thought processes of Nixon. The transcripts, more importantly, show a President and a top staff increasingly consumed by Watergate. Over time, weeks and months on, Nixon slowly defers much-needed work on hundreds of crucial issues—from Vietnam to the Soviets to China, from oil prices to wages to inflation. Even Nixon’s most savvy and thoughtfully-crafted endeavors in foreign policy begin to suffer, and the transcripts show the slow deterioration of Nixon’s focus and the growing isolation he felt. Most critically, the book still reads in places like tragedy—Shakespearean in the certainty that the heroes will all fall, either on their own swords or by the sword of another. Each man (and in those days Nixon’s White House was all men) eventually must save his own skin, or take the fall to protect the next in the line of succession—protagonists and antagonists alike angling to the last. The release of the transcripts was, to use one of Nixon’s office phrases, a Hail Mary pass. He was betting that those binders would sate the hunger of a hostile Congress and assuage an irritated court. He was also placing his chips on the court of public opinion. In his address to the American people the night he released the transcripts, Nixon said he knew the tapes would show him as someone who was trying to find the right path. He hoped that most Americans would see his actions as honorable, though he knew it would be painful for his place in the history books. “Never before in the history of the Presidency,” Nixon said that night, “have records that are so private been made so public.
In giving you these records—blemishes and all—I am placing my trust in the basic fairness of the American people.”

See more at: http://www.thursdayreview.com/WhiteHouseTranscripts.html