January 20, 2014 | Europe’s comet-chasing spacecraft woke up after a 957-day-long hibernation to begin the most comprehensive comet study to date. Part of its mission: attempt to place an instrumented lander on a comet’s nucleus for the first time. > read more
January 23, 2014 | There’ll be a new wrinkle facing NASA’s Dawn spacecraft when it reaches Ceres next year: what’s causing this big round ball to give off puffs of water vapor? > read more
January 24, 2014 | Can you believe it? A robotic rover designed to last 90 days is celebrating 10 years of successful exploration on the Red Planet — even taking a “selfie” for its handlers back on Earth. > read more
January 24, 2014 | A new image of the Lagoon Nebula from the Paranal Observatory in Chile provides a stunning view of the object, which lies in Sagittarius at a distance of 5,000 light-years from Earth. > read more
January 2, 2014 | Join astronomers in two new citizen science projects, Space Warps and Planet Four, which will have you investigating the warped light from faraway galaxies and the ever-changing Martian landscape. > read more
January 22, 2014 | Supernova 2014J has erupted to 11th magnitude in the galaxy M82 in Ursa Major. It’s visible in amateur telescopes in the evening sky. > read more
December 27, 2013 | Start the new year right with a little evening stargazing! Venus is dropping from sight low in the west just as Jupiter and mighty Orion are ascending in the east. > read more
January 22, 2014 | Join Sky & Telescope on the aurora adventure of a lifetime in October 2014! Walk through a rift valley, witness magnificent waterfalls and the Strokkur geyser, bathe in the Blue Lagoon, and best of all, maximize your chances of seeing the beautiful Northern Lights. > read more
January 21, 2014 | For only the second time in its 73-year history, ownership of Sky & Telescope has changed hands. On Friday, January 17th, F+W Media, Inc. acquired New Track Media, LLC, the parent company of Sky & Telescope magazine. > read more
January 24, 2014 | Mercury after sunset, and Venus before sunrise, have both climbed up into excellent view. After nightfall, Jupiter continues to dominate the moonless evening sky. > read more
The United Nations’ (UN) International Day of Commemoration in Memory of the Victims of the Holocaust remembers those who died and suffered during the Holocaust before and during World War II. It is on January 27 each year.
Local names
Name | Language
International Day of Commemoration in Memory of the Victims of the Holocaust | English
Día Internacional de Conmemoración anual en memoria de las víctimas del Holocausto | Spanish
יום הזיכרון הבינלאומי לשואה | Hebrew
يوم ذكرى المحرقة الدولي | Arabic
국제 홀로코스트 희생자 추모의 날 | Korean
Tag des Gedenkens an die Opfer des Nationalsozialismus | German
International Day of Commemoration in Memory of the Victims of the Holocaust 2015
Tuesday, January 27, 2015
On January 27 each year, the United Nations (UN) remembers the Holocaust that affected many people of Jewish origin during World War II. This day is called the International Day of Commemoration in Memory of the Victims of the Holocaust.
The day also commemorates the liberation of the Nazi concentration and death camp Auschwitz-Birkenau in Poland by Soviet troops on January 27, 1945. It is hoped that remembering these events will keep the memory of the Holocaust alive and help prevent future genocides.
Holocaust survivors and various leaders make their voices heard on the International Day of Commemoration in Memory of the Victims of the Holocaust. Many of them speak publicly about the Holocaust or their experiences around the event, its aftermath and why the world should never forget what happened in Europe in the 1930s and 1940s. Many statements emphasize the need for future generations to learn about and remember the Holocaust and for everyone to work towards preventing genocide.
The UN organizes and supports events such as: concerts by musicians who survived the Holocaust or are survivors’ descendants; art exhibitions influenced by the Holocaust; presentations of special stamps; the introduction of special educational programs; and film screenings and book signings focused on the Holocaust.
Israel and many countries in Europe and North America mark the International Day of Commemoration in Memory of the Victims of the Holocaust. Many academics present discussion papers or hold seminars or round-table discussions on the Holocaust and its legacy in the modern world. Schools or colleges may also have special lessons on the Holocaust. The Holocaust and how people commemorate it receive special attention on the Internet, on television and radio, and in the print media.
Public life
The International Day of Commemoration in Memory of the Victims of the Holocaust is a global observance and not a public holiday.
Background
The Holocaust, or Shoah (Sho’ah, Shoa), is the term used to describe the deliberate persecution and murder of millions of people prior to and during World War II in Germany and German-occupied areas in Europe. Many of them were Jewish, but the Roma people, Soviet civilians and prisoners of war, ethnic Poles, people with disabilities, homosexuals and political and religious opponents were also killed. Many people died in concentration and death camps spread across Nazi-occupied Europe. One of the most notorious camps was Auschwitz-Birkenau, near Oświęcim, Poland. More than one million people died in Auschwitz-Birkenau before Soviet troops liberated it on January 27, 1945.
On January 24, 2005, the UN General Assembly commemorated the 60th anniversary of the liberation of the Nazi concentration camps. Following this session, a UN resolution was drafted to designate January 27 as the International Day of Commemoration in Memory of the Victims of the Holocaust. The resolution called for education programs on the Holocaust to help prevent genocide. It also rejected denials that the Holocaust occurred. On November 1, 2005, the assembly adopted this resolution so the day could be observed each year. It was first observed on January 27, 2006.
Many Jewish groups, particularly in Israel, also observe Yom HaShoah, a day of mourning for Holocaust victims held on the 27th day of the Hebrew month of Nisan, which falls in April or May of the Gregorian calendar.
Symbols
The symbol of the “Holocaust and the United Nations Outreach Programme” consists of four elements on a solid black background. Two elements are the words “Remembrance and Beyond” and the UN symbol, both depicted in white. The UN symbol consists of a projection of the globe centered on the North Pole surrounded by two olive branches.
The other two elements are a piece of barbed wire and two white roses. The strands of the barbed wire merge into the stems of the roses. The barbed wire represents: the concentration camps; the loss of freedom of Jewish people and many other groups before and during World War II; and their pain and suffering.
The white roses represent peace, freedom and remembrance. These flowers also remind people of the White Rose, a non-violent resistance movement that was active in Germany from June 1942 until February 1943. In the United States and United Kingdom, white roses symbolize the investigation, remembrance and prevention of genocide.
International Day of Commemoration in Memory of the Victims of the Holocaust Observances
Weekday | Date | Year | Name | Holiday type
Thu | Jan 27 | 2000 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Sat | Jan 27 | 2001 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Sun | Jan 27 | 2002 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Mon | Jan 27 | 2003 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Tue | Jan 27 | 2004 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Thu | Jan 27 | 2005 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Fri | Jan 27 | 2006 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Sat | Jan 27 | 2007 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Sun | Jan 27 | 2008 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Tue | Jan 27 | 2009 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Wed | Jan 27 | 2010 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Thu | Jan 27 | 2011 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Fri | Jan 27 | 2012 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Sun | Jan 27 | 2013 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Mon | Jan 27 | 2014 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Tue | Jan 27 | 2015 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Wed | Jan 27 | 2016 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Fri | Jan 27 | 2017 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Sat | Jan 27 | 2018 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Sun | Jan 27 | 2019 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Mon | Jan 27 | 2020 | International Day of Commemoration in Memory of the Victims of the Holocaust | United Nations observance
Mr. Dobson in 2003, demonstrating his inexpensive telescope for amateur astronomers on a sidewalk in Portland, Ore. Garth Eliassen/Getty Images
Hour after hour, night after night, decade after decade all over planet Earth, John Dobson rolled his homemade telescopes to street corners and national parks to show people the heavens. “Look at Saturn,” he would say. “No charge.”
He gave hundreds of thousands of people a fresh view of the stars, prompting Smithsonian magazine to describe him as a “carny barker for the cosmos.” A lanky figure with a ponytail, he toured with his road show in a creaky former school bus, which he called Starship Centaurus A, after a galaxy. It towed one of his bulkier creations, a telescope as large as a midsize automobile.
Mr. Dobson, who died last Wednesday at 98 — or, as he might have put it, 123 days into his 99th orbit around the sun — is credited with developing the first high-powered portable telescope that amateur astronomers could build inexpensively, and tens of thousands have done so. Dobsonian telescopes, as they are known generically, are still a popular item on the market, though Mr. Dobson chose not to benefit from them commercially.
John Dobson lecturing in San Francisco in the 1980s. Mark Leet
He also founded a stargazing club, Sidewalk Astronomers, which announced his death, in Burbank, Calif. The organization now has chapters on every continent but Antarctica. He wrote books with inviting titles (“Astronomy for Children Under 80” is one) and appeared on Johnny Carson’s “Tonight Show.” In 2005 he was the subject of a documentary feature, “A Sidewalk Astronomer,” directed by Jeffrey Fox Jacobs.
Most compelling to him was divining what “the whole ball of wax” means. He delved into matters like the origin of the universe with both passers-by on the street and astrophysicists. He denounced the Big Bang theory on the ground that something cannot come from nothing — a view contrary to what many scientists believe — and wrote equations that he contended proved his point.
All this was perhaps par for the course for a man who spent 23 years living as a monk in a monastery of the Vedanta Society, a Hindu-inspired order noted for its intellectual rigor and vows of chastity. The abbot there assigned him to reconcile science and religion, and it was this mission that prompted him to scrounge through trash for materials to make his first telescope.
John Lowry Dobson was born on Sept. 14, 1915, in Beijing, where his parents were Methodist missionaries. As a child, he said, he lay on his back, gazed upward and imagined the sky as a vast ocean.
After leaving China because of political unrest, the family settled in San Francisco, and Mr. Dobson attended the University of California, Berkeley, graduating with a chemistry degree. Afterward he joined the Ramakrishna monastery in Sacramento, Calif., where he led worship services and cared for the flowers.
The head swami assigned him to spend the rest of his life reconciling ancient Hindu scripture with modern physics, Mr. Dobson said. “I don’t know what your problems are, but that was mine,” he was quoted as saying in a biography prepared by friends.
It was as part of this quest that he decided to make a telescope to look at the universe. As material he used plywood, cardboard tubes, glass from ship portholes and even cereal boxes. What resulted was essentially the same as the telescope Newton had developed in the 17th century: a tube with a concave mirror at the bottom to gather light, and a flat secondary mirror near the top to bounce light out to the eyepiece.
Mr. Dobson’s chief innovation was creating an axis at the base on a wooden mount that could move not just up and down but also sideways, like a cannon.
Mr. Dobson never sought a patent on his design or a copyright for the name, saying he did not care about money and wanted the telescopes distributed as widely as possible. Commercial manufacturers, seizing on the design, eventually did just that, selling versions in kits. Amateurs used them to see phenomena previously visible only to professional astronomers — precisely as Mr. Dobson had hoped. He said he had always wanted to share the exhilaration he felt at seeing, for the first time, in close-up, a three-quarters-full moon through a telescope he had made.
“It looked as if we’re coming in for a landing,” he said. “I thought, everybody has to see this.”
The abbot expelled Mr. Dobson in 1967, saying he was spending too much time outside the monastery with his telescopes. He left with only a $50 bill, slept on friends’ floors in San Francisco and foraged for food in Golden Gate Park. Though he lectured regularly, he never had a steady source of income. He told The Los Angeles Times in 2005 that the last year he had paid income tax was 1944.
Mr. Dobson had a son, Loren, with Ruth Ballard, a professor of genetics at Sacramento State University. They both survive him.
Mr. Dobson had a knack for phrasemaking that delighted audiences at the national parks he often visited. At Yellowstone, he was asked if the sky was part of the park. “No,” he said, “the park is part of the sky.”
His long view was long indeed. Human bodies, he told an audience, are made of stardust. He pointed to a photo of a nebula.
“If you give this cloud another 10 billion years,” he said, “it will go to school and chew gum.”
Correction: January 24, 2014
A picture credit with an earlier version of this obituary misidentified the photographer who took the picture of Mr. Dobson delivering a lecture. The photograph was taken by Mark Leet, not Gerard Pardeilhan.
Leslie Lee, a playwright whose award-winning work, much of it with the Negro Ensemble Company, focused on stretching the boundaries of the African-American experience as it was portrayed on the stage, died on Monday in Manhattan. He was 83.
The cause was congestive heart failure, Heather Massie, a friend, said.
Over four decades, Mr. Lee wrote more than two dozen stage works, scouring American history for his subjects and characters. In “Black Eagles,” he wrote about black fighter pilots in Italy in World War II. In “Ground People” (originally titled “The Rabbit Foot”), he wrote about Southern black sharecroppers and visiting minstrel-show performers in the 1920s.
In “Blues in a Broken Tongue,” the daughter of a family that had moved to Russia in the 1930s as an escape from racism discovers a pile of recordings by Billie Holiday, Paul Robeson and others and reconsiders her heritage. An early play, “The War Party,” was about the conflicts within a community civil rights organization in the 1960s.
A scene from the 2008 revival of his play, “The First Breeze of Summer,” by the Signature Theater Company. The 1975 Broadway production was nominated for a Tony. Hiroyuki Ito for The New York Times
In “The Book of Lambert,” written in the 1970s and set contemporaneously on an abandoned New York subway platform, a black intellectual has been reduced to despair by the loss of the white woman he loves. In “Colored People’s Time,” Mr. Lee presented a century of black history, from the Civil War to the dawn of the civil rights movement, in a pageantlike parade of vignettes.
“One can be black and also many other things,” Mr. Lee said in a 1975 interview about his writerly concerns. “I want to expand the thinking of blacks about themselves.”
Most of Mr. Lee’s work was produced Off Broadway and on regional stages, though his best-known play, “The First Breeze of Summer” (1975), appeared on Broadway, at the Palace Theater, after moving from the St. Mark’s Playhouse, then the home of the Negro Ensemble Company, in the East Village. It was nominated for a Tony Award for best play. (Tom Stoppard’s “Travesties” was the winner.)
“The First Breeze of Summer” tells the story of a middle-class black family in Pennsylvania whose ambitious and sensitive younger son is emotionally derailed when he learns the past secrets of the grandmother he reveres. Mr. Lee acknowledged that it was an autobiographical work. And at a time when black theater was often polemical, it was notable for its naturalistic drama and its probing of family dynamics and character.
That it had its debut in an earlier era, both theatrically and journalistically, was evident in Walter Kerr’s review in The New York Times.
“For all the explicitly black experience detailed in ‘The First Breeze of Summer,’ ” Mr. Kerr wrote near the conclusion of an unqualified rave that was redolent of surprise, “I have rarely seen a play at which someone who is not black can feel so completely at home.”
Leslie Earl Lee was born on Nov. 6, 1930, in Bryn Mawr, Pa., and grew up nearby in West Conshohocken, one of nine children. His mother, the former Clementine Carter, was a homemaker; his father, John Henry Lee, like the patriarch in “First Breeze,” was a plastering contractor.
Mr. Lee studied English and biology at the University of Pennsylvania — he thought he would be a doctor — and worked as a hospital medical technician, as a bacteriologist for the state health department and as a researcher for Wyeth, the pharmaceutical company, before abandoning his scientific pursuits in the mid-1960s to study playwriting at Villanova University. (For a time, his roommate was David Rabe, who went on to his own award-winning playwriting career).
Mr. Lee taught writing at several colleges, including New York University, and wrote several television scripts, including an adaptation of Richard Wright’s short story “Almos’ a Man.” “The First Breeze of Summer” was broadcast as part of the “Great Performances” series on public television.
His other stage work includes two collaborations with the composer Charles Strouse and the lyricist Lee Adams, creators of “Bye Bye Birdie,” “Applause” and other shows. Together they updated another Strouse-Adams show, “Golden Boy,” the 1964 musical based on Clifford Odets’s boxing drama; the newer version, with Mr. Lee’s book, was presented in 1989 at the Coconut Grove Playhouse in Florida.
The three men also worked on a musical about the Rev. Dr. Martin Luther King Jr. that follows Dr. King from his teenage years in Atlanta to the Montgomery bus boycott of the 1950s. The show had its premiere Off Broadway at the Kraine Theater in 2011.
Mr. Lee won numerous Audelco Awards, given to black theater artists and productions. He was married once and divorced. He is survived by a brother, Elbert, and three sisters, Evelyn Lee Collins, Grace Lee Wall and Alma Lee Coston.
In 2008, “The First Breeze of Summer” was revived Off Broadway by the Signature Theater Company in a production that starred Leslie Uggams and was directed by Ruben Santiago-Hudson.
“He captured African-American life with all its frailties and all its power,” Mr. Santiago-Hudson said in a telephone interview on Wednesday. “Most of all he bestowed integrity on people, even when they were ne’er-do-wells or people whose intentions weren’t the best for other folks. Leslie wasn’t only poetic; he was authentic.”
SARAH MARSHALL, ACTRESS IN ‘TWILIGHT ZONE’ AND ‘STAR TREK’
By DANIEL E. SLOTNIK
JAN. 25, 2014
Sarah Marshall with William Shatner on the set of “Star Trek.” CBS Paramount Television, via Photofest
Sarah Marshall, an actress who was born into show business and worked on Broadway, in film and on television with a galaxy of big names, perhaps most memorably in episodes of “The Twilight Zone” and “Star Trek,” died on Jan. 18 at her home in Los Angeles. She was 80.
The cause was stomach cancer, said her grandson, Seamus Marshall Bourne.
Ms. Marshall was the only daughter of the British film and theater stars Herbert Marshall and Edna Best. She left private school at 16 to pursue acting full time, with her mother’s help.
“We decided acting was a better education than school,” she was quoted as saying in Sidney Fields’s syndicated column “Only Human” in 1958.
A winsome young woman, she was often cast as an ingénue. She performed opposite José Ferrer in the 1953 Broadway revival of the cross-dressing farce “Charley’s Aunt” and won a Theater World Award for her work in the 1956 play “The Ponder Heart,” based on a Eudora Welty story.
She was nominated for a Tony for her performance in George Axelrod’s 1959 comedy “Goodbye, Charlie,” which also starred Lauren Bacall and Sydney Chaplin.
“There is a little gem of malicious acting by Sarah Marshall, whose honeyed style is spiked with vinegar,” Brooks Atkinson wrote in his review of that play in The New York Times.
Ms. Marshall’s first film was “The Long, Hot Summer” (1958), with Paul Newman and Joanne Woodward. She appeared with Kevin Kline and Sigourney Weaver in Ivan Reitman’s political comedy “Dave” (1993) and with Michelle Pfeiffer in “Dangerous Minds” (1995).
She was a mainstay on television, appearing on shows from “Alfred Hitchcock Presents” to “Cheers.” In 1962 she played a woman whose daughter vanishes into the fourth dimension in the “Twilight Zone” episode “Little Girl Lost,” and in 1967 she played a former love interest of William Shatner’s Capt. James T. Kirk in the “Star Trek” episode “The Deadly Years.”
Ms. Marshall was born in London on May 25, 1933. After her parents divorced in 1939, she and her mother moved to Los Angeles. In 1952 she married the set designer Melvyn Bourne. The marriage ended in divorce.
In 1958 she met the actor Karl Held while performing in “The World of Suzie Wong” on Broadway. They were married in 1964. He survives her, as do a son from her first marriage, Timothy M. Bourne, and four grandchildren.
Millard L. Midonick, previously of Family Court, being sworn in to Surrogate’s Court in 1972. Don Hogan Charles/The New York Times
Millard L. Midonick, a former Manhattan Family Court judge and surrogate who decided numerous celebrated estate cases, including those of the poet W. H. Auden and the painter Mark Rothko, died on Jan. 18 in Manhattan. He was 99.
His death was confirmed by his wife, Jill Claster Midonick.
Judge Midonick was a lifelong progressive who in 1953, in the aftermath of Adlai Stevenson’s first failed presidential campaign, helped found the Samuel J. Tilden Democratic Club, an organization on the East Side of Manhattan that supports reform-minded political candidates. A lawyer who handled labor arbitrations, and trusts and estates cases, he was appointed temporarily to the municipal bench by Mayor Robert F. Wagner in 1956. He became a Family Court judge in 1962, and in 1971 he was elected to the Surrogate’s Court.
In Family Court, Judge Midonick looked out especially for the rights of children, even those on the verge of adulthood. In one well-publicized case in 1970 in which he was reversed on appeal, he ordered the father of a college student to continue supporting her financially after he stopped paying her bills because her grades had fallen and, against his wishes, she had moved out of her dormitory. The father, a lawyer himself, berated his daughter in court, saying she no longer deserved his support because she had become a “hippie” who “stinks.”
“At some point,” Judge Midonick said in making his ruling, “minors must have some right to their own views and needs for their independent and painful transition from minority to adulthood, short of matching every fancy of their parents.”
The appeals court, in contrast, favored the right of the parent, declaring that “the father — in return for maintenance and support — is entitled to set reasonable standards, rules and regulations for his child.”
In one of his final acts on the Family Court bench, Judge Midonick struck a blow for female rape victims, joining a growing chorus of feminist critics of a New York statute that severely limited the state’s ability to prosecute violent sex crimes. (In 1969 there were 1,085 arrests for rape in New York City, resulting in 18 convictions.)
Forced by what he called “Victorian rules” to dismiss rape charges against two 15-year-olds despite his belief that the charges were “proven beyond a reasonable doubt,” Judge Midonick, in his written opinion, assailed the so-called corroboration requirement, which made it impossible to convict an accused rapist unless every material element of the attack — including the identity of the attacker and that the attacker used force — was established by forensic evidence or testimony by someone other than the victim.
“The corroboration requirement denigrates the testimony of women who claim to have been victimized sexually,” Judge Midonick wrote in December 1971. He added, “The sole object of this opinion is to expose again, and to persuade the Legislature to rectify, the miserable state of the law in respect to the requirement for corroboration in cases of sexual assault.”
Three months later, the State Assembly passed a bill modifying the corroboration requirement, and in 1974 Gov. Malcolm Wilson signed a bill eliminating it altogether.
Judge Midonick was elected to the Surrogate’s Court, which administers matters regarding affairs of the dead and their descendants, as a candidate who promised to put an end to a patronage system in which judges appointed political cronies to handle lucrative estate cases. He also pressed for the establishment of an Office of Public Guardian to represent infants and children unable to choose their own lawyers. (The responsibility now falls to the New York State Court System Department of Guardian and Fiduciary Services.) In his nearly 11 years as a surrogate, he handled hundreds of estate cases, many of whose tangled disputes ended up as front-page news articles.
His most famous case was that of Rothko and a years-long dispute over more than $30 million worth of his paintings. A leading abstract expressionist, Rothko killed himself in 1970.
A year later, a suit brought on behalf of his children, Kate and Christopher, charged that their father’s executors, along with a gallery that had contracted to sell 798 Rothko paintings, had cheated them.
The suit, which involved some 500 exhibits and 20,000 pages of testimony, was finally decided in 1975 when Judge Midonick found that the executors had been negligent in selling and consigning the paintings to the gallery for less than their true value. He removed the executors, replacing them with Kate Rothko, and assessed damages and fines of more than $9 million.
“This was a multimillion-dollar case,” he said after his ruling, “but I’ve handled thousands of cases involving neglected children and heart-rending adoption cases involving parents and real parents. I continue to be interested in human beings.”
In the Auden case, Judge Midonick ruled that an archive of the poet’s notebooks and papers rightfully belonged to the New York Public Library, which had received them from Auden’s longtime partner, Chester S. Kallman, rather than to Mr. Kallman’s father, Dr. Edward Kallman, who after his son’s death sued the library for the return of the papers. Among the evidence considered by Judge Midonick was an Auden poem that declared:
Shameless envious Age!, when the Public will shell out more cash for
Note-books and sketches that were never intended for them
than for perfected works. Observing erasures and blunders,
every amateur thinks: I could have done it as well.
Millard Lesser Midonick was born in Manhattan on May 24, 1914, and reared by his father, Abraham, a lawyer, and his mother, Ida Lesser, there and in Ardsley, N.Y.
Known to friends as Will — a nickname that came about when he was a boy and his younger sister could not properly pronounce his name — he received his undergraduate and law degrees from Columbia and then worked for the National Labor Relations Board. During World War II he was in the Coast Guard, achieving the rank of lieutenant commander and for a time serving as commanding officer of the U.S.S. Brownsville, a patrol frigate operating off the California coast.
He married Dorothy Rosenberg in 1941; she died in 1976. He is survived by his wife, a professor of history and former dean of the College of Arts and Sciences at New York University, whom he married in 1979.
Complaining about judges’ poor salaries and mandatory retirement age of 70, Judge Midonick stepped down from the bench in 1982, two years short of that age, and joined the firm of Willkie Farr & Gallagher. He later became counsel to another firm, Fensterstock & Partners.
“I’ve decided not to grow old,” he said after leaving the bench. “I find that when people retire, I get their wills in three years.”
Three men were arrested earlier this week after Los Angeles County sheriff’s deputies uncovered a secret bunker filled with guns, ammunition and white supremacist paraphernalia hidden under a house in Littlerock, Calif., about 65 miles northeast of Los Angeles.
Inside the soundproof underground bunker, investigators found a 25-yard shooting range, six pistols, 11 rifles, a World War II-era machine gun, more than 1,000 rounds of ammunition, over 100 magazines (some high-capacity), Nazi flags and pictures of at least one of the men posing in Nazi attire. Some of the weapons, like the machine gun, were illegal, and others were stolen, authorities said.
“It’s not something that anybody we’ve ever worked with has seen in their careers in law enforcement,” Sheriff’s Det. Julia Vezina told an NBC affiliate in Los Angeles. “When you open up the hatch, you look down and about 10 feet down, all concrete reinforced walls, soundproof with bars.”
Police took three men into custody after the Jan. 8 discovery: Royce Gresham, 33, Todd Hunt, 54, and Larry Finnell, 62, who lived in the house where the bunker was discovered. The men were charged with weapons violations and were to be arraigned at the Antelope Valley courthouse.
Deputies began their investigation last month after four guns were stolen from a storage unit in Palmdale. The information they gathered led them to the three men earlier this week.
While no evidence has yet emerged that the men were part of any hate group, at least one neighbor had noticed one of the men’s radical beliefs. “Larry was a survivalist,” Dale Snide, who lived next door to Finnell, told NBC4. “He’s concerned about the direction our government is going now.”
Lisa Waldron, 50, who lives with Finnell, claimed to Hatewatch that none of the men are tied up with Nazis or hate ideologies. She said that the bunker was well known, and that the police had raided it before. “It’s a workshop. It’s not a bomb shelter,” Waldron said. “You’re lucky if you’re a stick man and you’re able to get down there and look at the range. It’s a crawl hole.”
Several messages left yesterday with the Los Angeles County Sheriff’s Department were not immediately returned.
Sheila Guyse, center, performing in “Lost in the Stars.” George Karger/Pix Incorporated
Sheila Guyse, a popular actress and singer who appeared on Broadway and in so-called race movies in the 1940s and ’50s, and who for a time, despite limited opportunities in the entertainment industry, appeared headed for broader fame, died on Dec. 28 in Honolulu. She was 88.
The cause was complications of Alzheimer’s disease, her daughter Sheila Crystal Devin said.
For several years, Ms. Guyse (rhymes with “nice”) was compared to stars like Dorothy Dandridge, Lena Horne and Ruby Dee, black actresses who broke through racial barriers. But by the late 1950s she was out of show business, a result of some combination of health problems, a religious conversion and family obligations.
She left behind a handful of films. The best is probably “Sepia Cinderella” (1947), in which she played a girl-next-door who is initially overlooked by the musician she loves, played by the singer Billy Daniels. She also appeared in Broadway musicals and in nightclubs. Her only album, “This Is Sheila,” a collection of standards released by MGM Records in 1958, a decade after her heyday, was supposed to be a comeback. That November, Jet magazine put her on its cover.
Sheila Guyse
“Sheila Guyse, a glamorous, high-octane performer under supper club spotlights,” the article said, “is a singer who has had to overcome serious illness, marriage failures, financial pressures and professional disappointments in her long campaign to create a career in show business.”
The article quoted Ms. Guyse as saying, “I was discouraged and depressed for a while, but now life looks a lot better to me,” and mentioned a five-year recording contract. But the comeback never happened.
Ms. Guyse, who had surgery for bleeding ulcers in the mid-1950s, continued to have health problems. Ms. Devin, her daughter, recalled once finding her collapsed in her bedroom, bleeding from the mouth.
In addition, Ms. Guyse’s husband did not want her to have a career, Ms. Devin said.
Ms. Guyse’s first two marriages had ended in divorce, and she was a struggling single mother when she met Joseph Jackson, a New York sanitation worker so enthralled by her that he would sometimes follow her in his garbage truck. After they married, in the late 1950s, Ms. Guyse stopped performing and became increasingly involved with a Jehovah’s Witness hall in Queens.
“It wasn’t easy to be a glamorous movie star with people following you for your autograph and now you’re home making pancakes,” Ms. Devin said. “She did it, but I don’t think it was easy.”
Etta Drucille Guyse was born on July 14, 1925, in Forest, Miss. She took Sheila as a stage name. She followed her father, Wilbert, to New York when she was a teenager and, her daughter said, lived for a time in a Harlem rooming house with Billie Holiday.
After winning an amateur contest at the Apollo Theater, Ms. Guyse had a small role on Broadway in the musical “Memphis Bound!” and appeared in a series of all-black films, beginning with a small role in “Boy! What a Girl!” (1947), which starred the vaudeville performer Tim Moore. She moved on to starring roles in “Sepia Cinderella” and “Miracle in Harlem” (1948), in which she played a woman wrongly accused of murder.
She also appeared in the Broadway musicals “Finian’s Rainbow” (1947) and “Lost in the Stars” (1949).
In addition to Ms. Devin, who has worked as a model and actress under the name Sheila Anderson, Ms. Guyse is survived by another daughter, Deidre Devin, from her marriage to Mr. Jackson; two grandchildren; and four great-grandchildren. A son, Michael Jackson, died a few years ago. Joseph Jackson died in 2012.
Ms. Guyse moved back to Mississippi in the 1980s and to Hawaii about five years ago.
Her first marriage, to Ms. Devin’s father, a tailor named Shelby Irving Miller, was very brief. Her second, to Kenneth Davis, whom she had met while both performed in “Finian’s Rainbow,” lasted eight years. Mr. Davis, who was white, became a dancer with American Ballet Theater. In 1952, a photograph of the couple appeared on a cover of Jet with the headline “Negro Women With White Husbands.”
“I don’t go about looking for difficulties,” Ms. Guyse said in the article. “It took me a long time to decide to marry Ken, but I’m glad I did. We’ve been very happy. Intelligence and understanding are needed to make a marriage like ours succeed. It takes more than love. You have to have a mind of your own and be able to ignore what the world is saying and thinking about you.”
Russell Johnson during filming of a 1978 “Gilligan’s Island” reunion show. Wally Fong/Associated Press
Russell Johnson, an actor who made a living by often playing villains in westerns until he was cast as the Professor, the brains of a bunch of sweetly clueless, self-involved, hopelessly naïve island castaways, on the hit sitcom “Gilligan’s Island,” died on Thursday at his home in Bainbridge Island, Wash. He was 89.
His agent, Michael Eisenstadt, confirmed the death.
“Gilligan’s Island,” which was seen on CBS from 1964 to 1967 and still lives on in reruns, starred Bob Denver as Gilligan, the witless first mate of the S.S. Minnow, a small touring boat that runs aground on an uncharted island after a storm.
Besides Gilligan and the Professor, five others were on board: the Skipper (Alan Hale Jr.); Ginger, a va-va-voom movie star (Tina Louise); the snobbish wealthy couple Thurston Howell III (Jim Backus) and his wife, known as Lovey (Natalie Schafer); and Mary Ann, the prototypical girl next door (Dawn Wells).
In the show’s first season, Mr. Johnson and Ms. Wells were left out of the opening credits and their characters were ignored in the theme song, which named the other castaways but dismissed the two of them with the phrase “and the rest.” The snub was rectified for the second season, at the same time that the show went from black and white to color.
Russell Johnson, center, with Alan Hale Jr., left, and Bob Denver in an episode of the 1960s CBS sitcom “Gilligan’s Island.” CBS, via Photofest
The Professor was a good-looking but nerdy academic, an exaggerated stereotype of the man of capacious intelligence with little or no social awareness. Occasionally approached romantically by Ginger (and guest stars, including Zsa Zsa Gabor), he remained chaste and unaffected.
But he was pretty much the only character on the show who possessed anything resembling actual knowledge, and he was forever inventing methods to increase the castaways’ chance of rescue. Still, among the show’s many lapses of logic was the fact — often noted by Mr. Johnson in interviews — that although the Professor could build a shortwave radio out of a coconut shell, he couldn’t figure out how to patch a hole in a boat hull.
Avid fans — very avid — are probably the only ones to remember that the character’s name was actually Dr. Roy Hinkley, or that his academic résumé was explicitly spelled out.
“Professor, what exactly are your degrees?” Mr. Howell asked once.
“Well,” the Professor replied, “I have a B.A. from U.S.C., a B.S. from U.C.L.A., an M.A. from S.M.U. and a Ph.D. from T.C.U.”
Mr. Howell clucked in return: “Well, I don’t know much about your education, but it sounds like a marvelous recipe for alphabet soup.”
Russell David Johnson was born on Nov. 10, 1924, near Wilkes-Barre, Pa., the oldest of six children. His father died when Russell was not yet 10, and his mother sent him and two brothers to Girard College, then a school for poor orphan boys, in Philadelphia, where he finished high school. He served in the Army Air Forces during World War II, receiving a Purple Heart, and after his discharge studied on the G.I. Bill at the Actors’ Laboratory in Hollywood.
His first film role was in a 1952 drama about fraternity hazing, “For Men Only,” in which he played a sadistic fraternity leader; that led to a contract with Universal-International, which led to roles in a series of movies, mostly westerns (including “Law and Order,” in which he played Ronald Reagan’s no-good brother) and science fiction films, including “It Came From Outer Space.”
Later in the decade he began appearing frequently on television, often in western shows in the role of the black hat, even though he was a poor horseman. (When he played a marshal in the series “Black Saddle,” he suggested to the producer — “semi-seriously,” he said in an interview in 2004 — that the character be seen walking his horse into town and that he chase down the bad guys on foot.)
He also appeared in two episodes of “The Twilight Zone” involving time travel. In one, he tries to prevent the assassination of Abraham Lincoln; in the other, about a time machine that accidentally rescues a 19th-century murderer from a hanging, he plays the inventor, a professor.
Mr. Johnson’s survivors include his wife, Connie; a daughter, Kim; a stepson, Court Dane; and a grandson.
Ms. Louise and Ms. Wells are the only surviving “Gilligan’s Island” cast members.
After “Gilligan’s Island,” Mr. Johnson made a career guest-starring in other series, including the dramas “Mannix,” “Cannon” and “Lou Grant” and the comedies “Bosom Buddies” and “The Jeffersons,” usually as an upright character with smarts.
He also reprised the Professor role in the 1970s and 1980s in the cartoon series “The New Adventures of Gilligan” and “Gilligan’s Planet” and in three made-for-television “Gilligan” movies.
“ ‘Gunsmoke,’ ‘Wagon Train,’ ‘The Dakotas,’ you name a western, I did it,” he said of his career before “Gilligan.” He added: “I was always the bad guy in westerns. I played more bad guys than you can shake a stick at until I played the Professor. Then I couldn’t get a job being a bad guy.”
Russell Johnson was more than just the Professor. He starred in many other series, including one very chilling and frightening episode of “Thriller” entitled “The Hungry Glass.”
There are only two original members of the “Gilligan” cast still alive: Mary Ann (Dawn Wells) and Ginger (Tina Louise).
It was always great watching the tale of seven stranded castaways, who never failed to bring a smile to my face. The Professor and his MacGyver-like inventions never failed to make me laugh, even though he could never figure out how to repair the boat and leave the island; then again, if he had, there could not have been a “Gilligan’s Island.”
Now the Professor has left us, but not without leaving us happy and fond memories of a unique and enduring TV series.
Chryssa, a Greek-born American sculptor who in the 1960s was one of the first people to transform neon lighting from an advertising vehicle into a fine art medium, died on Dec. 23. She was 79.
Her death, which was reported in the Greek press, was not widely publicized outside the country. Perhaps fittingly for an artist whose work centered on enigma, the place of her death could not be confirmed; the Greek news media reported that she was buried in Athens.
Chryssa, who used only her first name professionally, had lived variously in New York and Athens over the years.
A builder of large-scale assemblages in a wide range of materials — bronze, aluminum, plaster, wood, canvas, paint, found objects and, in the case of neon, light itself — Chryssa, whose work prefigured Minimalism and Pop Art, was considered a significant presence on the American art scene in the ’60s and ’70s.
Exhibited widely in the United States in those years, her art is in the collections of major museums, including the Museum of Modern Art, the Guggenheim Museum and the Whitney Museum of American Art in New York, and the Corcoran Gallery of Art in Washington.
“Large Bird Shape” Collection Albright-Knox Art Gallery, Buffalo, NY. Gift of Frank L. Gentile, 1982.
Reviewing an exhibition of Chryssa’s neon sculptures at the Pace Gallery in Manhattan in 1968, The New York Times called one work, “Study for the Gates No. 15,” “a pure, lyrical form,” adding, “It transcends ‘neon-ness’ to become a sculpture of light devoid of pop or Broadway associations.”
New York, where Chryssa first lived in the mid-1950s, furnished the literal spark for her work.
She had long been fascinated with written communication, and her early work, haunting and deliberately obscure, focused on writing — in particular on fragmentary bits of text — as a medium of art.
Some of her first constructions were made of newspapers (in the New York of the period there were a great many to choose from), employing them as a sculptural medium. Others incorporated pieces of old advertising signs. Still others assumed the form of outsize letters and numbers, training the viewer’s eye on features of typographical anatomy, writ large.
Her first major piece, “Cycladic Books,” made not long after her arrival in New York, was a series of plaster panels covered with barely discernible markings, like clay tablets inscribed with an unreadable script from the ancient past.
But in a midcentury urban epiphany, Chryssa realized that neon tubing — which had been the exclusive province of sign makers — could provide the marriage of text, color and illumination she craved.
“I saw Times Square with its light and letters,” she said afterward, “and I realized it was as beautiful and difficult to do as Japanese calligraphy.”
She began incorporating neon into her work in the early ’60s and over time surmounted the fiendish technical difficulties the medium entailed.
One of her first major neon constructions, “Times Square Sky,” was completed in 1962. An assemblage of large cursive letters cast in metal, it was topped with the word “air,” written — airily — in pale blue neon.
In 1966, Chryssa completed “The Gates to Times Square,” a brightly lighted sculpture considered to be among her masterworks. Built of cast stainless steel, plexiglass and neon tubing, it takes the form of an immense cube, 10 feet on each side, through which visitors can walk.
“The Gates to Times Square” Collection Albright-Knox Art Gallery, Buffalo, NY. Gift of Mr. and Mrs. Albert A. List, 1972.
Inside, after passing through an entrance in the form of a large capital A, visitors are met with a counterpoint of symbols, text and colors.
Chryssa Vardea-Mavromichali was born in Athens on Dec. 31, 1933. She grew up amid the Nazi occupation of Greece, a time when members of the Greek underground communicated with one another by writing furtive messages on the walls of buildings.
A 1968 article about Chryssa in The New York Herald Tribune suggested that this was the wellspring of her obsession with fragmentary text.
Chryssa began her professional life as a social worker, assisting earthquake victims on the Greek island of Zakynthos. Growing disillusioned with what she saw as government intransigence, she left for Paris, where she studied art at the Académie de la Grande Chaumière and came under the influence of Surrealists like the poet André Breton and the artist Max Ernst.
Moving to the United States, she attended the California School of Fine Arts in San Francisco (now the San Francisco Art Institute) before settling in New York.
Her first solo exhibition in New York, featuring alphabetical and numerical constructions, was held at the Betty Parsons Gallery in 1961. Reviewing the exhibition in The Times, Stuart Preston commended her “clear, classical, daylight sense of order.”
That year, Chryssa’s paintings, reliefs and sculptures were featured in a solo exhibition at the Guggenheim. In later years her work was seen at the Leo Castelli Gallery in New York, the Walker Art Center in Minneapolis and elsewhere.
Chryssa, who became an American citizen, moved back to Athens in the early 1990s but later returned to New York. Information on survivors was not available.
Some critics expressed discomfort that Chryssa’s artwork, with its layers of atomized text, could not easily be interpreted. But that, she replied, was precisely the point.
As she told The Herald Tribune in 1968, “I have always felt that when things are spelled out they mean less, and when fragmented they mean more.”
Roy Campbell Jr., who carried the soulful swagger of hard-bop trumpet into the jazz avant-garde, where he became a pillar, died on Jan. 9 at his home in the Bronx. He was 61.
The cause was hypertensive atherosclerotic cardiovascular disease, said his sister, Valerie Campbell Morris, his only immediate survivor.
Mr. Campbell was a proud heir to the legacy of 1960s free jazz, as established by trailblazers like the saxophonist Albert Ayler, the pianist Cecil Taylor and the trumpeter Don Cherry, one of Mr. Campbell’s idols. Combining a pugnacious sound with an open-minded approach, Mr. Campbell worked with an array of colleagues in that lineage. He was a fixture at the Vision Festival in New York, an annual festival of avant-gardism, and recorded his most recent album, “Akhenaten Suite” (Aum Fidelity), in concert there in 2007.
As a composer and bandleader he favored strong rhythm and folkloric texture, putting those elements together in Tazz, an energetic quartet featuring piano, bass and drums, and Pyramid Trio, with the bassist William Parker and a succession of drummers. “Ethnic Stew and Brew,” a Pyramid Trio album released on Delmark in 2001, was one of Mr. Campbell’s most critically acclaimed.
For more than 20 years, off and on, he also stood front and center in Other Dimensions in Music, a ruggedly spontaneous band with Daniel Carter on reeds and flute (and sometimes trumpet), Mr. Parker on bass and Rashid Bakr on drums. He held a similar role as a member of the Nu Band, and in ensembles led by Mr. Parker, the pianist Matthew Shipp and the guitarist Marc Ribot.
Roy Sinclair Campbell Jr. was born in Los Angeles on Sept. 29, 1952, and raised from the age of 2 in the Bronx. His mother, Erna Arene Forte Campbell, worked at P.S. 21 in the Bronx; his father was a Wall Street communications specialist and a trumpeter himself. Roy Jr. began his musical training on piano and also learned flute and violin.
The trumpet became his focus during his senior year in high school, and from then on he moved quickly. Through the nonprofit music-outreach organization Jazzmobile, he studied with Lee Morgan, Kenny Dorham and Howard McGhee, assertive trumpeters from different points on the bebop spectrum. He majored in trumpet at the Borough of Manhattan Community College, where he also studied theory and composition with the esteemed multireedist Yusef Lateef, who died last month at 93.
Mr. Campbell formed his first band, Spectrum, at 20, and began playing widely as a sideman, notably with Ensemble Muntu, a fixture on New York’s 1970s loft-jazz scene. He released his debut album, “New Kingdom” (Delmark), in 1992, around the time he ended a two-year stint in the Netherlands; its opening track was “I Remember Lee,” a pledge of allegiance to Morgan. (On his following album, Mr. Campbell would include a waltz titled “Booker’s Lament,” after another influence, the trumpeter Booker Little.)
Beyond his affinities with hard bop and free jazz, Mr. Campbell worked in a range of styles including funk, hip-hop and reggae. And he was an encouraging mentor to younger trumpeters, both informally and in his capacity as a founder of the stylistically broad Festival of New Trumpet Music, which he established with Dave Douglas in 2003.
Arnold R. Pinkney and his wife, Betty, after voting in 1971 in what was the first of his two bids to become Cleveland’s mayor. United Press International
“There are always more slaves than slave masters,” the Rev. Jesse Jackson roared in a speech in Selma, Ala., early in his 1984 presidential campaign. “We can win! We got master’s degrees in disappointment and Ph.D.s in how to overcome!”
The crowd’s ecstatic response to Mr. Jackson, a prominent civil rights activist who had never held elective office, underlined one of the strengths of his effort to be considered a credible challenger for the Democratic nomination and potentially the first black president of the United States.
Another strength was a team of professionals doing the groundwork of mobilizing voters, led by Mr. Jackson’s campaign manager, Arnold R. Pinkney, who died on Monday in Cleveland at 83.
Though his efforts fell short in votes, Mr. Pinkney was instrumental in rallying minorities, the poor and the disenfranchised to Mr. Jackson’s cause. Mr. Jackson, defying expectations, emerged from a crowded field to finish third in the race for the nomination behind former Vice President Walter F. Mondale and Senator Gary Hart.
In managing the Jackson campaign, Mr. Pinkney likened his goal to one an earlier generation of political aides had set in persuading voters to see Dwight D. Eisenhower as not only the victorious Army commander of World War II but also a potential president who could manage the government in peacetime.
“Somebody sold him as a politician,” Mr. Pinkney told The Los Angeles Times. “Our job is to make that transition for Jackson.”
Mr. Pinkney brought to the campaign a seasoned understanding of both political success and failure. In 1967 he worked to elect Carl Stokes the first black mayor of a large American city, Cleveland (a job Mr. Pinkney himself later sought twice). The next year he managed the successful campaign of Mr. Stokes’s brother, Louis, to become the first black member of Congress from Ohio. Mr. Pinkney helped run many campaigns of both black and white politicians, including President Jimmy Carter’s unsuccessful re-election bid in 1980.
It was hearing the speech of a white politician on the radio in 1948 when he was a teenager that sparked Mr. Pinkney’s devotion to politics, prompting him to discard his parents’ affection for Republicans in favor of Democrats.
In the speech, Hubert H. Humphrey, then the young mayor of Minneapolis and a rising star in national politics, was imploring delegates at the Democratic National Convention in Philadelphia to endorse equal rights for blacks. Mr. Humphrey — who went on to become a United States senator from Minnesota and Lyndon B. Johnson’s vice president — delivered the call with such vigor that Southern segregationists stalked out of the hall, and the party.
Twenty-four years later, Mr. Pinkney was Mr. Humphrey’s deputy campaign manager in a race to win the Democratic presidential nomination for the second time and unseat the man who had defeated Mr. Humphrey four years earlier, Richard M. Nixon.
The relationship between Mr. Humphrey and Mr. Pinkney deepened as they traveled from primary to primary, and after Mr. Humphrey, then a senator again, lost the California primary, he publicly promised that if his campaign revived and he won the general election, he would bring Mr. Pinkney into his administration.
“It was the greatest moment of my life,” Mr. Pinkney told The Plain Dealer in Cleveland in 1997, notwithstanding that Mr. Humphrey went on to lose the nomination to Senator George S. McGovern.
Last week, Mr. Jackson said of Mr. Pinkney, “With his passing, a huge part of history goes with him.”
He was born in Youngstown, Ohio, on Jan. 6, 1931. His father died three months before he finished high school, so he worked in steel mills to help his family make ends meet.
He graduated from Albion College in Michigan, where he won letters in football, track, baseball and basketball. During a stint in the Army, he played baseball with major leaguers. Paul O’Dea, a scout for the Cleveland Indians, told him that he had a shot at making the big leagues by his late 20s, but advised him to go to law school instead. “Your race needs more lawyers than baseball players,” Mr. Pinkney recalled Mr. O’Dea saying.
He took the advice and attended what is now Case Western Reserve University School of Law, but he dropped out for financial reasons. He then became one of the first black agents hired by the Prudential Insurance Company of America and later opened a successful insurance agency. As a civil rights activist, he led a membership drive for the N.A.A.C.P. and joined the picketing of a Cleveland supermarket that had refused to hire blacks.
He began his political career by helping out on local campaigns for judges, then volunteered for Carl Stokes’s mayoral campaign. Louis Stokes tapped him to be his paid campaign manager in 1968. Mr. Pinkney was later president of the Cleveland Board of Education and twice sought the city’s mayoralty, losing in a three-man race in 1971 and again in 1975. After the second defeat, he moved to Shaker Heights, a Cleveland suburb.
Mr. Jackson said he had chosen Mr. Pinkney to run his 1984 campaign because he was a “voice of pragmatism” and because of his experience with national campaigns. When he took over, Mr. Pinkney set about righting a campaign that was in disarray: Field offices had not been set up, phones were not being answered, and Mr. Jackson was often showing up late for appearances.
But he also had to contend with problems outside his control. The Jackson forces were buoyed when Louis Farrakhan, the leader of the black separatist Nation of Islam, announced his support. But then a recording surfaced in which Mr. Farrakhan made remarks widely interpreted as anti-Semitic. Amid a storm of outrage, Mr. Pinkney helped draft a statement calling Mr. Farrakhan’s words “reprehensible.”
Another challenge came when Mr. Jackson made diplomatic forays to Latin America, meeting with Nicaragua’s leftist leaders and leftist rebels in El Salvador to try to steer them toward peace and winning the release of more than 20 political prisoners in Cuba. Mr. Pinkney worried that with the race for the nomination nearing its end, Mr. Jackson was absent, squandering a chance to attract maximum attention on domestic issues at a crucial time. In his absence, The Washington Post said, Mr. Pinkney would “look after his interests.”
Mr. Jackson unsuccessfully sought the nomination again in 1988, with Gerald F. Austin as his campaign manager.
Mr. Pinkney’s death was announced by his family. His survivors include his wife, Betty, and their daughter, Traci.
Mr. Pinkney liked to say the changes in America that led to Barack Obama’s election as president in 2008 began with Carl Stokes’s victory in Cleveland four decades earlier. On the night of Mr. Obama’s victory, Mr. Pinkney told a crowd of celebrators that blacks could no longer justifiably refuse to fight in foreign wars for a country that treated them as second-class citizens.
“This wipes all that out,” he said. “No one can accuse the country of that again. It’s a magnificent night.”
Franklin McCain in 2010. Lynn Hey/Associated Press
Franklin McCain, who helped fuel the civil rights movement in 1960 when he and three friends from their all-black college requested, and were refused, coffee and doughnuts at a whites-only lunch counter in Greensboro, N.C., died on Thursday in Greensboro. He was 73.
The cause was respiratory complications, his son Franklin Jr. said.
Mr. McCain was one of the so-called Greensboro Four, who sat down at lunch counter stools at an F. W. Woolworth store on Feb. 1, 1960, fully expecting that they would not be served. When they were not, they came back the next day, and the next, and the next.
As word of the protest spread, others, in ever-growing numbers, joined them. By the end of the fifth day, more than a thousand had arrived. And on July 25, the store relented and made the lunch counter available to all.
It was not the first such sit-in. After the Supreme Court’s order to desegregate the public schools in 1954, activists tried to integrate lunch counters in Oklahoma City, Baltimore and other cities on the periphery of the segregated South. There had been similar efforts in the Deep South, particularly in Orangeburg, S.C., in 1955 and ’56 and in Durham, N.C., in 1957.
Mr. McCain, second from left, at a sit-in at a Woolworth’s lunch counter in Greensboro, N.C., in 1960. United Press International
But the Greensboro episode, by most estimations, had the widest impact, inviting national publicity and inspiring a heightened level of activism among college students and other youths. Later that year, the Student Nonviolent Coordinating Committee, one of the most effective civil rights groups, was born among students from Southern black colleges.
Others soon imitated the Greensboro campaign in more than 55 cities and towns in 13 states. Only some were successful, but their cumulative effect was to contribute to the momentum that led to the Civil Rights Act of 1964, which banned segregation in restaurants with interstate operations, as Woolworth had.
The Woolworth sit-in could be traced to the fall of 1959, when Mr. McCain and three other freshmen at the Agricultural and Technical College of North Carolina in Greensboro would get together to bat around issues of the day as “elementary philosophers,” as Mr. McCain put it in an interview for “My Soul Is Rested,” a 1977 oral history of the civil rights movement by Howell Raines, a former executive editor of The New York Times.
Mr. McCain said a large question kept arising in their late-night sessions: “At what point does a moral man act against injustice?”
On Sunday night, Jan. 31, 1960, they decided to act. Bolstering one another’s courage, they resolved that they would sit down on lunch-counter stools the next day and stay there until they were served.
“Well, you know, that might be weeks, that might be months, that might be never,” one of the four, Ezell Blair Jr., now known as Jibreel Khazan, recalled saying. The other two students were Joseph McNeil and David Richmond. Mr. Richmond died in 1990.
Yolande Betbeze Fox protesting segregationist store policies in New York in June 1960. Neal Boenzi/The New York Times
The next afternoon, they walked a mile to the Woolworth at Elm and Market Streets, arriving about 3:20. They bought some school supplies and waited for their receipts as proof of purchase. They later recalled chafing at how eagerly the store had taken their money for merchandise while refusing it at the lunch counter, directing them instead to a basement hot dog stand.
“We wonder why you invite us in to serve us at one counter and deny service at another,” Mr. McCain recalled saying. “If this is a private club or private concern, then we believe you ought to sell membership cards and sell only to persons who have a membership card. If we don’t have a card, then we’d know pretty well that we shouldn’t come in or even attempt to come in.”
That, he recounted, “didn’t go over too well.” But as he sat waiting for a doughnut that he knew would never come, Mr. McCain felt oddly empowered.
“The best feeling of my life,” he said in an interview with The Associated Press in 2010, was “sitting on that dumb stool.”
“I felt so relieved,” he continued. “Nothing has ever happened to me before or since that topped that good feeling of being clean and fully accepted and feeling proud of me.”
Mr. McCain described the scene to Mr. Raines. A police officer paced, patting a club in his hand, but without provocation he seemed powerless to act. A black dishwasher derided “the rabble-rousers” as potentially hurting black people. Some whites uttered racial epithets, but others whispered encouragement.
Students held a sit-in at a lunch counter for a sixth day of protests at Woolworth’s. United Press International.
One white woman said that she was proud of the young men and that she wished they had acted 10 years earlier. At that moment, Mr. McCain later said, he discarded any concept he had of racial stereotypes.
Franklin Eugene McCain was born on Jan. 3, 1941, in Union County, N.C., and raised in Washington. In a biography prepared for the PBS documentary “February One” in 2010, Mr. McCain said he had grown up being taught what he called “the big lie” — that if he behaved and studied hard, all opportunities would be open to him.
At North Carolina A&T, he earned a degree in chemistry and biology. He went on to work as a chemist and sales representative for the Celanese Corporation for nearly 35 years. He was active in civil rights organizations and served on the boards of his alma mater; his wife’s alma mater, Bennett College, a historically black college for women in Greensboro; and the governing body for the 17-campus University of North Carolina system.
His wife, the former Bettye Davis, died in 2013. In addition to his son Franklin Jr., his survivors include two other sons, Wendell and Bert, and six grandchildren.
Years earlier, on Feb. 1, 1980, all of the Greensboro Four returned for a re-enactment of their historic action. A black vice president of Woolworth was there to serve them. Because of the flurry of celebration and the crush of reporters, the guests of honor never got to eat.
“Twenty years ago I could not get served here,” Mr. McCain said. “I come back today and I still can’t get served.”
The Rev. Vincent J. Termine, a Roman Catholic priest who helped revive a dying parish in Bensonhurst, Brooklyn, in the 1960s and ’70s, then angered members of his predominantly white flock when he let black youths from another neighborhood participate in organized basketball at the church, died on Dec. 26 in Johns Island, S.C., where he had lived since 2009. He was 93.
His death was confirmed by his brother, John.
Father Termine’s parish, Most Precious Blood, was in serious disrepair and its elementary school had just lost its teaching order of nuns when he was transferred there in 1967. Over the next decade he led a drive to raise more than a million dollars to rebuild the church. He also recruited a new teaching order of nuns for the school.
Racial tensions flared in the midst of those efforts, in the mid-1970s, after Gerard Papa, a community-minded Brooklyn lawyer, organized a basketball league, known as the Flames, to bring together Italian-American boys from the suburban-style homes of south Brooklyn and blacks and Hispanics from the projects.
Rev. Vincent J. Termine
Father Termine (pronounced TER-mine) agreed to let them use his church as their home base so that they would qualify to compete in Catholic Youth Organization tournaments. (Lacking a basketball court, the church offered its bingo hall for use as one.)
The decision angered many parishioners. At the Flames’ first practice session, bat-wielding white toughs menaced the black players whom Father Termine had welcomed to his church, on Bay 47th Street.
By his account, Father Termine resolved the dispute by going to a Brooklyn social club, where he knew he could find the father of one of the bat-wielding toughs — “a local, ah, man of respect,” as he described him to Robert Lipsyte, then a columnist for The New York Times, in 1994.
“He stormed into the back room,” Mr. Lipsyte wrote, relating Father Termine’s account. “Cards and chips flew as he roared — (‘I can be dramatic when necessary’) — about Jesus and justice.”
Father Termine said the neighborhood man gave him his personal pledge of safe-conduct for the Flames. There were no further incidents.
The Flames were one of the few racially integrated basketball squads in the city while they played in C.Y.O. games under the banner of Most Precious Blood Church from the mid-’70s until the late ’90s. In 1997, two years after Father Termine retired, C.Y.O. officials barred them from competition after the new pastor at Precious Blood declined to continue sponsoring them, saying he had a new “vision” for his church’s sports program.
Mr. Papa contested the decision in a civil suit, but was unsuccessful.
Vincent Joseph Termine was born in Brooklyn on March 4, 1920, one of five children of Charles Termine, a bus dispatcher, and his wife, Mary. He lived and worked in Brooklyn almost all his life.
After graduating magna cum laude from St. John’s University and receiving his religious training at Immaculate Conception Seminary in Huntington, on Long Island, Mr. Termine was ordained in 1944 and began the first of many pastoral assignments in Brooklyn.
He served as assistant pastor at the Church of St. Michael-St. Edward in Fort Greene, as chaplain at the old Raymond Street Jail, and at the Cumberland Hospital. He was assigned to St. Rocco’s Church in Park Slope; Saint Mark Church in Sheepshead Bay; the tuberculosis hospital in Manhattan Beach; St. Blaise Church in Crown Heights (now the Church of St. Francis of Assisi and St. Blaise); and, as Catholic chaplain, to the city Sanitation Department’s Brooklyn operations.
The refurbished Most Precious Blood Church reopened in 1976, and weekly attendance began to grow.
Father Termine remained a resident of Brooklyn — he had a home in Coney Island — until he moved to South Carolina to be closer to family members. In addition to his brother John, a physician and medical researcher, he is survived by another brother, Charles Termine, a surgeon.
Father Termine was something of a throwback to another era. He “seems to have come from old black-and-white movies,” Mr. Lipsyte wrote in his 1994 column, “a burly 74-year-old with Bible stories and wicked winks and knobby hands that have snatched away dice in back alleys and paddled whole classrooms and hacked through teenage rumbles.
“He started clubs for kids, and when they didn’t show up he stomped into pool rooms and candy stores and dragged them to the church,” Mr. Lipsyte wrote. “He bought Ping-Pong tables with his own money. He gave confession in parked cars, bars and once in the freezer of a butcher shop. His mother had advised him to ‘Nag, nag, nag,’ because sooner or later the kids would remember the message, and that someone cared enough to keep delivering it to them. Father Termine still thinks it is good advice.”
On Dec. 29, 1890, United States cavalry, in the last battle of the American Indian wars, massacred as many as 350 Lakota Sioux at Wounded Knee in South Dakota. Three generations later, Carter Camp, a 32-year-old Indian militant, retaliated.
On the night of Feb. 27, 1973, he led the first wave of armed, self-styled warriors in an operation to seize Wounded Knee, which had become a town on the Pine Ridge Indian Reservation. The invaders, carrying a list of grievances against the federal government, seized the trading post, cut the telephone lines, ran the Bureau of Indian Affairs police out of town and took 11 hostages.
“We were pretty sure that we were going to have to give up our lives,” Mr. Camp said in an interview for the PBS program “American Experience” in 2009.
A caravan of 200 cars carrying Indians and their supporters followed, beginning a 71-day, gunshot-punctuated standoff that some applauded as a show of new assertiveness by long-downtrodden Indians and that others deplored as criminal.
Carter Camp, center, in Wounded Knee, S.D., during a 71-day standoff in 1973. William Kunstler, the radical lawyer, is at left. Associated Press
By the time it was over, two Indians had been shot to death and a federal marshal was paralyzed. He later died. Mr. Camp was convicted of abducting, confining and beating four postal inspectors during the siege and served three years in prison.
He went on to spend decades fighting for Indian rights and died at 72 on Dec. 27 in White Eagle, Okla., the headquarters of the Ponca tribe, of which he was a member. The cause was kidney and liver cancer, his brother Craig said.
Carter Camp’s dream was to regain the vast lands his people had lost through unfair and broken treaties. But he started by setting his sights lower, leading a campaign in 1970 to change the way federal money for Indian education was allocated on the Ponca reservation. He became state leader of the American Indian Movement, or AIM, which was organized in 1968 in Minneapolis as a defender of American Indian sovereignty. In 1972, he helped lead an AIM caravan from the West Coast to Washington, where “red power” advocates occupied the Bureau of Indian Affairs building.
During the Wounded Knee occupation the next year, alongside the AIM leaders Dennis Banks and Russell C. Means, Mr. Camp was the spokesman who presented the group’s demands to the government, among them that the government honor 371 broken treaties and that it end what the group called corrupt tribal governments. Mr. Camp rejected an offer of leniency if the protesters left immediately.
“We decided that the Indian people were more important to us than jail terms,” he was quoted as saying in “The Road to Wounded Knee” (1974), by Robert Burnette and John Koster.
When the Indians finally did end their occupation, Mr. Camp was one of the leaders who signed the agreement. Mr. Banks did not.
In August 1973, Mr. Camp was elected chairman of AIM but within weeks was ejected from the organization after being accused of shooting another AIM leader, Clyde Bellecourt, in the stomach. News accounts and histories say Mr. Camp was angry that Mr. Bellecourt had accused him of being a paid informer for the F.B.I. Charges were dropped after Mr. Bellecourt and a witness refused to testify.
The episode precipitated swirls of speculation. In his 1983 book, “In the Spirit of Crazy Horse,” Peter Matthiessen said Mr. Camp had called Mr. Bellecourt a coward because he refused to carry a gun. Others suggested that the F.B.I. had planted the rumor that Mr. Camp was an informer to damage AIM’s credibility.
Bruce E. Johansen, the author of “Encyclopedia of the American Indian Movement” (2013), wrote that Mr. Bellecourt had tried to salvage Mr. Camp’s reputation but that Mr. Means had insisted he be expelled.
Carter Augustus Camp was born in Pawnee, Okla., on Aug. 18, 1941. He graduated from Haskell Institute, a high school for Indians in Lawrence, Kan. (It became Haskell Indian Nations University.) He then joined the Army and served in Western Europe. After his discharge, he worked in a factory in Los Angeles, serving as shop steward for the electrical workers’ union.
Mr. Camp returned to Oklahoma to be close to his roots, literally. “We believe the soil and every plant contains the dust of our ancestors,” he once said.
In recent years, Mr. Camp had fought against garbage companies’ using Indian lands for disposal; a proposed pipeline to bring tar sands oil from Canada; and a bar catering to motorcyclists near his reservation. He protested a re-enactment of the Lewis and Clark expedition, calling it a remembrance of the extermination of his people.
In addition to his brother, Mr. Camp is survived by another brother, Dwain; his wife, Linda; his sons Kenny, Jeremy, Victorio, Mazhonaposhe and Augustus; his sister, Casey Camp-Horinek; 24 grandchildren; and a great-granddaughter.
Mr. Camp helped organize annual sun dances conducted by Leonard Crow Dog, the spiritual leader of the Wounded Knee occupiers. Participants, who may not eat or drink, dance around a cottonwood tree from sunrise to sunset.
RUN RUN SHAW, CHINESE-MOVIE GIANT OF THE KUNG FU GENRE
By JONATHAN KANDELL
JAN. 6, 2014
Run Run Shaw in 1978 with his wife and daughter. Mr. Shaw and his older brother were movie pioneers in Asia, producing and sometimes directing films and owning cinema chains. Central Press, via Getty Images
Run Run Shaw, the colorful Hong Kong media mogul whose name was synonymous with low-budget Chinese action and horror films — and especially with the wildly successful kung fu genre, which he is largely credited with inventing — died on Tuesday at his home in Hong Kong. He was 106.
His company, Television Broadcasts Limited, announced his death in a statement.
Born in China, Mr. Shaw and his older brother, Run Me, were movie pioneers in Asia, producing and sometimes directing films and owning lucrative cinema chains. His companies are believed to have released more than 800 films worldwide.
After his brother’s death in 1985, Mr. Shaw expanded his interest in television and became a publishing and real estate magnate as well. For his philanthropy, much of it going to educational and medical causes, he was knighted by Queen Elizabeth II and showered with public expressions of gratitude by the Communist authorities in Beijing.
Mr. Shaw enjoyed the zany glamour of the Asian media world he helped create. He presided over his companies from a garish Art Deco palace in Hong Kong, a cross between a Hollywood mansion and a Hans Christian Andersen cookie castle. Well into his 90s he attended social gatherings with a movie actress on each arm. And he liked to be photographed in a tai chi exercise pose, wearing the black gown of a traditional mandarin.
Asked what his favorite films were, Mr. Shaw, a billionaire, once replied, “I particularly like movies that make money.”
Run Run Shaw was born Shao Yifu in Ningbo, Zhejiang Province, on Nov. 23, 1907. As a child, he moved to Shanghai, where his father ran a profitable textile business. According to some Hong Kong news media accounts, Run Run and Run Me were English-sounding nicknames the father gave his sons as part of a family joke that played on the similarity of the family name to the word rickshaw.
Evincing little interest in the family business, Run Run and Run Me turned instead to entertainment. The first play they produced was called “Man From Shensi,” on a stage, as it turned out, of rotten planks. As the brothers often told the story, on opening night the lead actor plunged through the planks, and the audience laughed. The Shaws took note and rewrote the script to include the incident as a stunt. They had a hit, and in 1924 they turned it into their first film.
After producing several more movies, the brothers decided that their homeland, torn by fighting between Nationalists and Communists, was too unstable. In 1927 they moved to Singapore, which was then part of British colonial Malaya.
Besides producing their own films in Singapore, the brothers imported foreign movies and built up a string of theaters. Their business boomed until the Japanese invaded the Malay Peninsula in 1941 and stripped their theaters and confiscated their film equipment. But according to Run Run Shaw, he and his brother buried more than $4 million in gold, jewelry and currency in their backyard, which they dug up after World War II and used to resume their careers.
With the rise of Hong Kong as the primary market for Chinese films, Run Run Shaw moved there in 1959, while his brother stayed behind looking after their Singapore business.
In Hong Kong, Run Run Shaw created Shaw Movietown, a complex of studios and residential towers where his actors worked and lived. Until then, the local industry had turned out 60-minute films with budgets that rarely exceeded a few thousand dollars. Shaw productions ran up to two hours and cost as much as $50,000 — a lavish sum by Asian standards at the time.
Mr. Shaw went on to plumb the so-called dragon-lady genre with great commercial success. Movies like “Madame White Snake” (1963) and “The Lady General” (1965) offered sexy, combative, sometimes villainous heroines, loosely based on historical characters. And by the end of the 1960s, he had discovered that martial-arts films in modern settings could make even more money.
His “Five Fingers of Death” (1973), considered a kung fu classic, was followed by “Man of Iron” (1973), “The Shaolin Avengers” (1976) and many others. Critics dismissed the films as artless and one-dimensional, but spectators crowded into the theaters to cheer, laugh or mockingly hiss at the action scenes. To ensure that his films were amply distributed, Mr. Shaw expanded his chain of cinemas to more than 200 houses in Asia and the United States. “We were like the Hollywood of the 1930s,” he said. “We controlled everything: the talent, the production, the distribution and the exhibition.”
Other Hong Kong producers, directors and actors called Mr. Shaw’s methods iron-fisted. In 1970, Raymond Chow, a producer with Mr. Shaw’s company, Shaw Brothers, left to form his own company, Golden Harvest, which gave more creative and financial independence to top directors and stars.
Mr. Chow’s biggest success, and Mr. Shaw’s most notable loss, was his decision to bankroll Bruce Lee. Mr. Lee initially approached Shaw Brothers, which turned down his demand for a long-term contract of $10,000 per film. Golden Harvest then offered Mr. Lee creative control and profit-sharing.
“The Big Boss,” better known as “Fists of Fury” (1971), was Mr. Lee’s first film with Golden Harvest, and it broke all Hong Kong box-office records. Other big-name actors and directors flocked to Golden Harvest, breaking Shaw Brothers’ virtual monopoly.
But Run Run Shaw had already expanded beyond the film industry. His investments in the new phenomenon of Asian television were to prove even more lucrative than his movie productions. In 1972 he began Television Broadcasts (TVB), and he soon gained control of 80 percent of the Hong Kong market. TVB churned out 12 hours of its own programming a day, much of it soap operas and costume dramas that riveted Chinese television viewers on the mainland and throughout Southeast Asia.
As his fortune grew, Mr. Shaw donated generously to hospitals, orphanages and colleges in Hong Kong, for which he was named a Commander of the Order of the British Empire in 1974 and awarded a knighthood in 1977. In 1990 he donated 10 million pounds to help establish the Run Run Shaw Institute of Chinese Affairs at Oxford University, where his four children had studied. In 2004 he established the Shaw Prize, an international award for research in astronomy, mathematics and medicine.
As Hong Kong’s days as a British colony dwindled, Mr. Shaw stepped up his philanthropy in China. He contributed more than $100 million to scores of universities on the mainland and raised money in support of Chinese victims of floods and other natural disasters. Chinese leaders toasted him for his generosity at banquets in Beijing.
Mr. Shaw’s philanthropy did not extend to the United States, but he was once viewed as a white knight in New York. In 1991, when Macy’s was on the verge of bankruptcy, he bought 10 percent of its preferred shares for $50 million, becoming one of the largest shareholders in R. H. Macy & Company.
The investment had a personal aspect. Ten years earlier, Mitchell Finkelstein, the son of Macy’s chief executive, Edward S. Finkelstein, had married Hui Ling, a Shaw protégée who appeared in many of his movies. Mr. Shaw met the older Finkelstein at the wedding, and they became friends.
In later years, the aging mogul himself seemed in need of help to keep his media empire intact. Concerned with the rise of cable and satellite television, he sold a 22 percent stake in TVB to Rupert Murdoch’s News Corporation in 1993.
Mr. Shaw had intended to maintain control over his media business by balancing his one-third share in TVB against Mr. Murdoch’s 22 percent and the 24 percent held by Robert Kuok, one of Hong Kong’s richest entrepreneurs. But the balance of power shifted when Mr. Murdoch sold his equity to Mr. Kuok shortly afterward. Then, in 1996, in Hong Kong’s first case of a hostile takeover, Mr. Kuok forced Mr. Shaw to sell him his shares in TVE, the lucrative publishing, music and real estate subsidiary of TVB. The deal reduced Mr. Shaw’s TVB stake to 23 percent.
Mr. Shaw’s business situation was also hindered by his inability to groom credible successors. His sons, Vee Meng and Harold, were at one time heavily involved in the family enterprises, but their relationship with him had become strained.
Mr. Shaw’s first wife, Wong Mee Chun, died in 1987. He married Mona Fong, a former singer and actress, in 1997. She survives him. Other survivors include his sons and two daughters, Dorothy and Violet, also from his first marriage.
Even after turning 90, Mr. Shaw maintained a powerful presence in the Hong Kong film world through his control of Shaw Studios. But a newer generation of independent producers came to dominate the Hong Kong market with their own violent brand of police and gangster films.
Halton C. Arp in 2005. His dogged promotion of an unorthodox theory led to exile from his peers. Jean-Pierre Jans
Halton C. Arp, a provocative son of American astronomy whose dogged insistence that astronomers had misread the distances to quasars cast doubt on the Big Bang theory of the universe and led to his exile from his peers and the telescopes he loved, died on Dec. 28 in Munich. He was 86.
The cause was pneumonia, said his daughter Kristana Arp, who said he also had Parkinson’s disease.
As a staff astronomer for 29 years at Hale Observatories, which included the Mount Wilson and Palomar Mountain observatories in Southern California, Dr. Arp was part of their most romantic era, when astronomers were peeling back the sky and making discovery after discovery that laid the foundation for the modern understanding of the expansion of the universe.
But Dr. Arp, an artist’s son with a swashbuckling air, was no friend of orthodoxy. A skilled observer with regular access to a 200-inch telescope on Palomar Mountain, he sought out unusual galaxies and collected them in “The Atlas of Peculiar Galaxies” (1966), showing them interacting and merging with loops, swirls and streamers that revealed the diversity and beauty of nature.
But these galaxies also revealed something puzzling and controversial. In the expanding universe, as discovered by Edwin Hubble in 1929, everything is moving away from us. The farther away it is, the faster it is going, as revealed by its redshift, a stretching of light waves — like the changing tone of an ambulance siren as it goes past — known as a Doppler shift.
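The arithmetic behind that relationship is simple enough to sketch, and it is exactly the chain of inference Dr. Arp spent his career contesting. In modern notation (a minimal sketch; the Hubble constant value below, roughly 70 kilometers per second per megaparsec, is a present-day figure assumed for illustration, since estimates in Dr. Arp’s era ranged from about 50 to 100):

% Redshift and Hubble's law in outline; the H_0 value is an
% assumed illustrative figure, not one taken from this article.
\[
  z = \frac{\lambda_{\mathrm{observed}} - \lambda_{\mathrm{emitted}}}{\lambda_{\mathrm{emitted}}},
  \qquad
  v \approx cz \quad (z \ll 1),
  \qquad
  d \approx \frac{v}{H_0}.
\]

On those assumptions, a galaxy with a redshift of z = 0.01 recedes at about 3,000 kilometers per second and lies roughly 43 megaparsecs, about 140 million light-years, away. Measure the redshift, treat it as a distance indicator, and the distance follows; it was the middle step that Dr. Arp questioned.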
Dr. Arp found that galaxies with radically different redshifts, and thus at vastly different distances from us, often appeared connected by filaments and bridges of gas. This suggested, he said, that redshift was not always an indication of distance but could be caused by other, unknown physics.
The biggest redshifts belonged to quasars — brilliant, pointlike objects that are presumably at the edge of the universe. Dr. Arp found, however, that they were often suspiciously close in the sky to relatively nearby spiral galaxies. This suggested to him that quasars were not so far away after all, and that they might have shot out of the nearby galaxies.
If he was right, the whole picture of cosmic evolution given by the Big Bang — of a universe that began in a blaze of fire and gas 14 billion years ago and slowly condensed into stars, galaxies and creatures over the eons — would have to go out the window.
A vast majority of astronomers dismissed Dr. Arp’s results as coincidences or optical illusions. But his data appealed to a small, articulate band of astronomers who supported a rival theory of the universe called Steady State and had criticized the Big Bang over the decades. Among them were Fred Hoyle of Cambridge University, who had invented the theory, and Geoffrey Burbidge, a witty and acerbic astrophysicist at the University of California, San Diego. Dr. Arp survived both of them.
“When he died, he took a whole cosmology with him,” said Barry F. Madore, a senior research associate at the Carnegie Observatories in Pasadena, Calif.
Halton Christian Arp was born on March 21, 1927, in New York City, the only son of August and Anita Arp. His father was an artist and his mother ran institutions for children and adolescents. Halton grew up in Greenwich Village and various art colonies and did not go to school until fifth grade. After bouncing around public schools in New York, he was sent to Tabor Academy, on Buzzards Bay in Massachusetts, a prep school for the United States Naval Academy.
After a year in the Navy, he attended Harvard, where he majored in astronomy. He graduated in 1949 and went on to obtain a Ph.D. in 1953 at the California Institute of Technology, which had started an astronomy graduate program to prepare for the advent of the 200-inch telescope.
At Harvard, he became one of the best fencers in the United States, ultimately competing in world championship matches in Paris in 1965. Cutting a dashing figure, he would adopt a fencer’s posture when giving talks. “He would strut across the stage and then strut back, as if he were dueling,” Dr. Madore said.
Dr. Arp married three times. He is survived by his third wife, Marie-Helene Arp, an astronomer in Munich; four daughters, Kristana, Alissa, Andrice and Delina Arp; and five grandchildren.
Dr. Arp became a staff astronomer at the Hale Observatories after stints as a postdoctoral fellow at the Carnegie Institution for Science and Indiana University. His breakthrough occurred, as he recalled, on a rainy night at Palomar in 1966, when he decided to investigate a chance remark by a colleague that many of his peculiar galaxies had radio sources near them in the sky. Looking them up in the Palomar library, he realized that many of those radio sources were quasars that could have been shot out of a nearby galaxy, an idea first explored by the Armenian astronomer Victor Ambartsumian a decade earlier.
“It is with reluctance that I come to the conclusion that the redshifts of some extragalactic objects are not due entirely to velocity causes,” Dr. Arp wrote in a paper a year later.
He combed the sky for more evidence that redshifts were not ironclad indicators of cosmic distance, knowing that he was striking at the heart of modern cosmology. He turned out to be an expert at finding quasars in suspicious places, tucked under the arm of a galaxy or at the end of a tendril of gas.
One of the most impressive was a quasarlike object known as Markarian 205, which had a redshift corresponding to a distance of about a billion light years but appeared to be in front of a galaxy only 70 million light years away.
The redshift controversy came to a boil in 1972, when Dr. Arp engaged in a debate, arranged by the American Association for the Advancement of Science, with John N. Bahcall, a young physicist at the Institute for Advanced Study. Timothy Ferris described the event in his book “The Red Limit” (1977): “When the debate was over, it was difficult not to be impressed with Arp’s sincerity and his love for the mysterious galaxies he studied, but it was also difficult to feel that his case had suffered anything short of demolition.”
As Dr. Arp’s colleagues lost patience with his quest, he was no longer invited to speak at major conferences, and his observing time on the mighty 200-inch telescope began to dry up. Warned in the early 1980s that his research program was unproductive, he refused to change course. Finally, he refused to submit a proposal at all, on the grounds that everyone knew what he was doing. He got no time.
Dr. Arp took early retirement and joined the Max Planck Institute for Astrophysics near Munich, where he continued to promote his theories. He told his own side of the redshift story in a 1989 book, “Quasars, Redshifts and Controversies.”
AMIRI BARAKA, POLARIZING POET-FOUNDER OF BLACK ARTS MOVEMENT
January 9, 2014 | 03:49PM PT
Amiri Baraka, the militant man of letters and tireless agitator whose blues-based, fist-shaking poems, plays and criticism made him a provocative and groundbreaking force in American culture, died Thursday at Newark Beth Israel Medical Center in Newark, N.J. He was 79 and had been hospitalized since last month.
First published in the 1950s, Baraka crashed the literary party in 1964, at the Cherry Lane Theater in Greenwich Village, when “Dutchman” opened and made instant history at the height of the civil rights movement. The play, published when Baraka was still known as LeRoi Jones, was a one-act showdown between a middle-class black man, Clay, and a sexually daring white woman, Lula, ending in a brawl of murderous taunts and confessions.
“Charlie Parker. All the hip white boys scream for Bird,” Clay says. “And they sit there talking about the tortured genius of Charlie Parker. Bird would’ve not played a note of music if he just walked up to East 67th Street and killed the first 10 white people he saw. Not a note!”
“From Amiri Baraka, I learned that all art is political, although I don’t write political plays,” the Pulitzer Prize-winning dramatist August Wilson once said.
Perhaps no writer of the 1960s and ’70s was more radical or polarizing than the former LeRoi Jones, and no one did more to extend the political debates of the civil rights era to the world of the arts. He inspired at least one generation of poets, playwrights and musicians, and his immersion in spoken word traditions and raw street language anticipated rap, hip-hop and slam poetry. The FBI feared him to the point of flattery, identifying Baraka as “the person who will probably emerge as the leader of the Pan-African movement in the United States.”
Baraka transformed himself from a rare black presence in the Beat caravan of Allen Ginsberg and Jack Kerouac into the leader of the Black Arts Movement, an ally of the Black Power movement that rejected the liberal optimism of the early ’60s and intensified a divide over how and whether the black artist should take on social issues. Scorning art for art’s sake and the pursuit of black-white unity, Baraka was part of a philosophy that called for the teaching of black art and history and the production of works that bluntly called for revolution.
“We want ‘poems that kill,’” Baraka wrote in his landmark “Black Art,” a manifesto published in 1965, the year he helped found the Black Arts Movement. “Assassin poems. Poems that shoot guns/Poems that wrestle cops into alleys/and take their weapons leaving them dead/with tongues pulled out and sent to Ireland.”
He was as eclectic as he was prolific: His influences ranged from Ray Bradbury and Mao Zedong to Ginsberg and John Coltrane. Baraka wrote poems, short stories, novels, essays, plays, musical and cultural criticism and jazz operas. His 1963 book “Blues People” has been called the first major history of black music to be written by an African-American. A line from his poem “Black People!” — “Up against the wall mother f—–” — became a counterculture slogan for everyone from student protesters to the rock band Jefferson Airplane. A 2002 poem he wrote alleging that some Israelis had advance knowledge of the Sept. 11 attacks led to widespread outrage.
He was denounced by critics as buffoonish, homophobic, anti-Semitic, a demagogue. He was called by others a genius, a prophet, the Malcolm X of literature. Eldridge Cleaver hailed him as the bard of the “funky facts.” Ishmael Reed credited the Black Arts Movement for encouraging artists of all backgrounds and enabling the rise of multiculturalism. The scholar Arnold Rampersad placed him alongside Frederick Douglass and Richard Wright in the pantheon of black cultural influences.
The Cuban revolution, the assassination in 1965 of Malcolm X and the Newark riots of 1967, when the poet was jailed and photographed looking dazed and bloodied, radicalized him. Jones left his white wife (Hettie Cohen), cut off his white friends and moved from Greenwich Village to Harlem. He renamed himself Imamu Ameer Baraka, “spiritual leader blessed prince,” and dismissed the Rev. Martin Luther King Jr. as a “brainwashed Negro.” He helped organize the 1972 National Black Political Convention and founded the Congress of African People. He also founded community groups in Harlem and Newark, the hometown to which he eventually returned.
The Black Arts Movement was essentially over by the mid-1970s, and Baraka distanced himself from some of his harsher comments — about Dr. King, about gays and about whites in general. But he kept making news. In the early 1990s, as Spike Lee was filming a biography of Malcolm X, Baraka ridiculed the director as “a petit bourgeois Negro” unworthy of his subject. In 2002, respected enough to be named New Jersey’s poet laureate, he shocked again with “Somebody Blew Up America,” a Sept. 11 poem with a jarring twist.
“Who knew the World Trade Center was gonna get bombed,” read a line from the poem. “Who told 4,000 Israeli workers at the Twin Towers to stay home that day?”
Then-Gov. James E. McGreevey and others demanded his resignation. Baraka refused, denying that “Somebody Blew Up” was anti-Semitic (the poem also attacks Hitler and the Holocaust) and condemning the “dishonest, consciously distorted and insulting non-interpretation of my poem.”
Juanita Moore, who earned an Academy Award nomination in 1960 for the single major film role she ever landed, then fell through the cracks of a Hollywood system with little to offer a black actress besides small parts as maids and nannies, died on Tuesday in Los Angeles. She was 99.
Juanita Moore. Associated Press
Her death was confirmed by her grandson, Kirk Kelleykahn, an actor and dancer.
Ms. Moore received a best supporting actress nomination for her role in the 1959 film “Imitation of Life,” in which she played opposite Lana Turner in a story about two single mothers, one black and one white. It was only the fifth time an African-American performer had been nominated for an Oscar.
The two women begin ostensibly as social equals living under the same roof, but their lives diverge along racial and class lines. Ms. Turner’s character becomes a famous actress; Annie Johnson, played by Ms. Moore, becomes her housemaid.
The last film that the filmmaker Douglas Sirk directed in Hollywood, “Imitation of Life” was widely dismissed as campy melodrama at the time. Its treatment of the intense suffering caused by racial bias, including a subplot in which Annie’s light-skinned daughter renounces her to live as a white person, was seen as unbelievable. (“If by accident we should pass in the street,” the daughter, played by Susan Kohner, tells her, “please don’t recognize me.” Ms. Kohner was also nominated for a best supporting actress Oscar.)
But the film has since been re-evaluated and given high marks by many film historians and critics for the subtlety of its social criticism and psychological insight.
Ms. Moore’s performance, in particular, has earned her generations of new fans, said Foster Hirsch, a professor of film at Brooklyn College who has organized several academic conferences on “Imitation of Life.”
“She delivers an astounding performance,” Mr. Hirsch said. “She does a death scene that still reduces audiences to tears — I have seen it many times.”
But after she was nominated for an Oscar, Ms. Moore told The Los Angeles Times in 1967, the work seemed to dry up. “The Oscar prestige was fine, but I worked more before I was nominated,” she said. “Casting directors think an Oscar nominee is suddenly in another category. They couldn’t possibly ask you to do one or two days’ work.”
It would be a decade more before black actresses like Ms. Moore would be considered for major roles, Mr. Hirsch noted.
Ms. Moore was born in Greenwood, Miss., on Oct. 19, 1914, and raised in South Central Los Angeles, the youngest of Harrison and Ella Moore’s eight children. After graduating from high school and spending a few months at Los Angeles City College, she decamped for New York in search of a stage career.
She became a dancer. Throughout the 1930s and ’40s she performed in the elaborate stage shows of nightclubs in Harlem, including the Cotton Club, and in Paris and London, before returning to Los Angeles. She studied acting at the Actors’ Laboratory and began getting small, uncredited parts in films, such as a maid and an African tribeswoman. She was already in her mid-30s by the time she made her film debut, in Elia Kazan’s “Pinky” (1949), also a film about race. (Throughout her career she hid her true age, saying she had been born in 1922.)
After “Imitation of Life,” she appeared in television dramas and in films including “Walk on the Wild Side” and “The Singing Nun.” She appeared on Broadway in James Baldwin’s play “The Amen Corner” in 1965 and in a London production of “A Raisin in the Sun.” And she was active on the Los Angeles stage, performing with the Ebony Showcase Theater and the Cambridge Players.
Mr. Kelleykahn, her grandson, is her only immediate survivor. Ms. Moore’s first husband, the dancer Nyas Berry, died in 1951. Her second husband, Charles Burris, a Los Angeles bus driver, died in 2001.
Sam Staggs, author of the 2009 book “Born to Be Hurt: The Untold Story of ‘Imitation of Life,’ ” said in a phone interview on Friday that Ms. Moore’s performance was the major reason for the film’s box-office success (it was one of the most successful movies made to that point by Universal Studios).
People came in droves to watch in the dark and weep, Mr. Staggs said: “There are many, many people alive today who remember crying at her performance, but who could not tell you her name.”
On Sept. 20, 1958, the Rev. Dr. Martin Luther King Jr., then emerging as the leader of the civil rights movement, was autographing copies of his new book in a Harlem department store when a woman approached to greet him. He nodded without looking up. Then she stabbed him in the chest with a razor-sharp seven-inch letter opener.
Dr. W. V. Cordice Jr. NYC Health and Hospitals Corp.
Dr. Cordice operated on the Rev. Dr. Martin Luther King Jr. at Harlem Hospital. John Lent/Associated Press
Dr. King, then 29, was taken to Harlem Hospital, where three surgeons went to work. The blade had missed his aorta by millimeters, and doctors said a sneeze could have caused him to bleed to death. After mapping out a strategy, they used a hammer and chisel to crack Dr. King’s sternum, and repaired the wound in two and a half hours.
On Dec. 29, the last surviving surgeon from that hospital team, Dr. W. V. Cordice Jr., died at 94 in Sioux City, Iowa, his granddaughter Jennifer Fournier said. He had moved to Iowa in November to be near family.
“I think if we had lost King that day, the whole civil rights era could have been different,” Dr. Cordice said in a Harlem Hospital promotional video in 2012.
New York’s governor at the time, W. Averell Harriman, who raced to the hospital to observe the surgery, had requested that black doctors be involved if at all possible, Hugh Pearson reported in his 2002 book, “When Harlem Nearly Killed King.” Dr. Cordice and Dr. Aubré de Lambert Maynard, the hospital’s chief of surgery, were African-American. The third surgeon, Dr. Emil Naclerio, was Italian-American.
Over the years, Dr. Maynard was widely credited with saving Dr. King — and he accepted that credit — but in a 2012 interview with the public radio station WNYC, Dr. Cordice said that he and Dr. Naclerio had performed the surgery.
“We were not going to challenge him, because he was the boss,” Dr. Cordice said of Dr. Maynard.
Alan D. Aviles, the president of the New York City Health and Hospitals Corporation, suggested that Dr. Cordice’s modesty may also have kept him from getting the credit he deserved. “It is entirely consistent with his character that many who knew him may well not have known that he was also part of history,” Mr. Aviles said in a statement.
At the time of the stabbing, Dr. King was promoting his book “Stride Toward Freedom: The Montgomery Story,” which recounted the successful boycott he helped lead to desegregate buses in Montgomery, Ala. His assailant was a mentally disturbed black woman who blamed Dr. King for her woes. Dr. King forgave her and asked that she not be prosecuted. He later learned that she had been committed to a hospital for the criminally insane.
John Walter Vincent Cordice Jr. was born in Aurora, N.C., on June 16, 1919. His father, a physician, worked for the United States Public Health Service there, fighting the flu epidemic of 1918. The family moved to Durham, N.C., when John was 6. He graduated from high school a year early, and then from New York University and its medical school.
With the outbreak of World War II, he interrupted his internship at Harlem Hospital to serve as a doctor for the Tuskegee Airmen, the famed group of African-American pilots. After the war, and after completing the internship, he held a succession of residencies. In 1955-56 he studied in Paris, where he was part of the team that performed the first open-heart surgery in France.
Dr. Cordice later became chief of thoracic and vascular surgery at Harlem Hospital, the position he held when he treated Dr. King. He went on to hold the same post at Queens Hospital Center. He was president of the Queens Medical Society in 1983-84.
Dr. Cordice, who lived in Hollis, Queens, for many years before moving to Iowa, is survived by his wife of 65 years, the former Marguerite Smith; his daughters, Michele Boykin, Jocelyn Basnett and Marguerite D. Cordice; his sister, Marion Parhan; six grandchildren; and six great-grandchildren.
Dr. Naclerio died in 1985, Dr. Maynard in 1999.
Dr. King wrote thank-you letters to all three surgeons. In his last public speech before his assassination in 1968, he reflected on the implications of his surviving the stabbing.
“If I had sneezed, I wouldn’t have been around in 1960, when students all over the South started sitting in at lunch counters,” he said. “If I had sneezed, I wouldn’t have been around here in 1961, when we decided to take a ride for freedom and ended segregation in interstate travel.
“If I had sneezed, I wouldn’t have been here in 1963, when the black people of Birmingham, Ala., aroused the conscience of this nation, and brought into being the Civil Rights Bill. If I had sneezed, I wouldn’t have had a chance later that year, in August, to try to tell America about a dream I had.”
PHIL EVERLY, HALF OF A PIONEER ROCK DUO THAT INSPIRED GENERATIONS
By Natalia V. Osipova and Sofia Perpetua
Saying Farewell to a Rock Icon: Phil Everly, as half of the Everly Brothers, inspired the Beatles, Linda Ronstadt, Simon and Garfunkel and many others who recorded their songs and tried to emulate their ringing vocal alchemy. Associated Press
Phil Everly, whose hits with his older brother, Don, as the Everly Brothers carried the close fraternal harmonies of country tradition into pioneering rock ’n’ roll, died on Friday in Burbank, Calif. He was 74.
Phil Everly, left, and his brother, Don, performed at Caesars Palace in Las Vegas in 1970. Las Vegas News Bureau/European Pressphoto Agency
The group’s official website said he died in a hospital near his home in Southern California. His son Jason said the cause of death was complications of chronic obstructive pulmonary disease.
With songs like “Wake Up Little Susie,” “Bye Bye Love,” “Cathy’s Clown,” “All I Have to Do Is Dream” and “When Will I Be Loved?,” which was written by Phil Everly, the brothers were consistent hitmakers in the late 1950s and early 1960s. They won over country, pop and even R&B listeners with a combination of clean-cut vocals and the rockabilly strum and twang of their guitars.
They were also models of rock vocal harmony for the next generations: the Beatles, Linda Ronstadt, Simon and Garfunkel and many others recorded their songs and tried to emulate their precise, ringing vocal alchemy. The Everly Brothers were inducted into the Rock and Roll Hall of Fame in its first year, 1986.
The Everlys brought tradition, not rebellion, to their rock ’n’ roll. Their pop songs reached teenagers with Appalachian harmonies rooted in gospel and bluegrass. Their first full-length album, “The Everly Brothers” in 1958, held their first hits, but the follow-up that same year, “Songs Our Daddy Taught Us,” was a quiet collection of traditional and traditional-sounding songs.
They often sang in tandem, with Phil Everly on the higher note and the brothers’ two voices virtually inseparable. That sound was part of a long lineage of country “brother acts” like the Delmore Brothers and the Monroe Brothers. In an interview in November, Phil Everly said: “We’d grown up together, so we’d pronounce the words the same, with the same accent. All of that comes into play when you’re singing in harmony.”
Paul Simon, whose song “Graceland” includes vocals by Phil and Don Everly, said in an email on Saturday morning: “Phil and Don were the most beautiful sounding duo I ever heard. Both voices pristine and soulful. The Everlys were there at the crossroads of country and R&B. They witnessed and were part of the birth of rock and roll.”
The Everly Brothers’ music grew out of a childhood spent singing. Phillip Everly was born in Chicago on Jan. 19, 1939, the son of a Kentucky coal miner turned musician, Ike Everly, and his wife, Margaret. The family had left Kentucky, where Don Everly was born in 1937, for musical opportunities in Chicago. They soon moved on to Iowa, where Ike Everly found steady work playing country music on live radio. In Shenandoah, Iowa, Ike Everly got his own show — at 6 a.m. on the radio station KMA — and in 1945, “Little Donnie” and the 6-year-old “Baby Boy Phil” started harmonizing with their parents on the air. They went to school after they performed.
The Everly family moved on to radio shows in Indiana and Tennessee. In 1955 the teenage brothers settled in Nashville, where they were hired as songwriters before starting the Everly Brothers’ recording career.
They had a blockbuster in 1957: “Bye Bye Love,” a song written by the husband-and-wife team Felice and Boudleaux Bryant. It reached No. 1 on the country chart, No. 2 on the pop chart and No. 5 on the rhythm and blues chart, selling over a million copies. They followed it with another Bryants song, “Wake Up Little Susie,” which was a No. 1 pop hit and another million-seller. For the next few years, they were rarely without a Top 10 pop hit. Among them were “All I Have to Do Is Dream” in 1958, “Bird Dog” and “Devoted to You” in 1958, “(Till) I Kissed You” in 1959, and, in 1960, “Let It Be Me,” “Cathy’s Clown” (written by Don and Phil Everly) and “When Will I Be Loved.”
Their hitmaking streak ended in the United States in the early 1960s, lasting slightly longer in Britain. But they continued to tour and make albums, notably the 1968 “Roots,” a thoughtful foray into country-rock that included a snippet of a 1952 Everly family radio show. They had a summer variety series on CBS in 1970.
But the brothers were growing estranged. In 1973, at a concert in California, Phil Everly smashed his guitar and walked offstage, and Don Everly announced the duo’s breakup. They recorded solo albums for the next decade before reuniting in 1983, with a concert at the Royal Albert Hall in London that was filmed as a documentary. They returned to the studio for a 1984 album, “EB84,” that was produced by the British pub-rocker Dave Edmunds and included a song written for the Everlys by Paul McCartney; they made two more studio albums in the 1980s.
Among musicians the Everlys had generations of admirers. The Beatles included Everly Brothers songs in their live sets and modeled the vocal harmonies of “Please Please Me” on “Cathy’s Clown.” The Beach Boys recorded the Everlys song “Devoted to You.” Linda Ronstadt had a Top 10 hit with “When Will I Be Loved” in 1975. On his four-album set “These Days” in 2006, the country songwriter Vince Gill recorded a duet with Phil Everly, “Sweet Little Corinna.”
Simon and Garfunkel included “Bye Bye Love” on their “Bridge Over Troubled Water” album, and years later brought together the Everly Brothers to be their opening act for their 2003 “Old Friends” tour. “I loved them both,” Mr. Simon wrote. “Phil was outgoing, gregarious and very funny. Don is quiet and introspective. When Simon and Garfunkel toured with the Everlys in 2003, Art and I would take the opportunity to learn about the roots of rock and roll from these two great historians. It was a pleasure to spend time in their company.”
The Everly Brothers played their last headlining tour in 2005 in Britain. They were also heard together on a 2010 album by Don’s son, Edan Everly, in a dark song about child stardom called “Old Hollywood.”
Phil Everly is survived by his brother and by their mother, Margaret Everly; his wife, Patti; his sons, Jason and Chris; and two granddaughters.
In 2013, younger musicians released two albums of Everly Brothers songs: “What the Brothers Sang” by Dawn McCarthy and Bonnie Prince Billy (the indie rocker Will Oldham), and “Foreverly” by Norah Jones and Billie Joe Armstrong of Green Day, a remake of every song on “Songs Our Daddy Taught Us.”
“The Everly Brothers go way back as far as I can remember hearing music. Those harmonies live on forever,” Mr. Armstrong posted on Twitter.
“I always thought I’d be the one to go first,” Don Everly wrote in a statement to the Associated Press. “The world might be mourning an Everly Brother, but I’m mourning my brother Phil.”
From left, James Avery, Will Smith and Janet Hubert in the popular 1990s NBC sitcom “The Fresh Prince of Bel-Air.”
By PETER KEEPNEWS
Published: January 1, 2014
James Avery, who played Will Smith’s pompous but well-meaning uncle on the popular 1990s sitcom “The Fresh Prince of Bel-Air,” died on Tuesday in Glendale, Calif. He was 68.
The cause was complications of heart surgery, said his mother, Florence J. Avery.
Mr. Avery played Philip Banks, a wealthy lawyer (later a judge) who becomes a surrogate father to his street-smart nephew, played by Mr. Smith, on “The Fresh Prince of Bel-Air,” seen on NBC from 1990 to 1996.
Reviewing the show in The New York Times early in its first season, John J. O’Connor said Mr. Smith — at the time a popular rapper just beginning his acting career — was “frequently overshadowed by the rest of the cast, particularly James Avery as the father and Karyn Parsons as the older daughter, Hilary.”
A classically trained actor with an imposing physical presence and a resonant voice, Mr. Avery was born on Nov. 27, 1945, in Pughsville, Va., near what is now Suffolk, and grew up in Atlantic City. In the late 1960s, he served in the Navy.
In addition to “The Fresh Prince,” his numerous television credits include “L.A. Law,” “The Closer” and “That ’70s Show.” Among the movies in which he appeared were “Fletch” and “8 Million Ways to Die.”
He was a busy voice artist in animated films and television shows as well. He was the voice of Shredder, the title characters’ nemesis, on the 1987-96 cartoon series “Teenage Mutant Ninja Turtles.”
In addition to his mother, Mr. Avery’s survivors include his wife, Barbara.
January 2, 2014 | Spotted on New Year’s Eve by a telescope in Arizona, a small asteroid struck Earth over the Atlantic Ocean — apparently unnoticed — just one day later. > read more
January 3, 2014 | Our lifeline to outer space began as a temporary, manually run network of radio dishes. The Deep Space Network, which turns 50 this year, now communicates with spacecraft traversing the whole solar system. > read more
December 30, 2013 | Sky & Telescope predicts that 2014’s best meteor shower won’t be one of the traditional displays. Instead, on May 24th the predawn skies over North America might come alive with a robust display of “shooting stars” shed by Comet 209P/LINEAR. > read more
December 30, 2013 | This year features three celestial cover-ups that favor North Americans: total lunar eclipses on April 15th and October 8th, and a partial solar eclipse on October 23rd. > read more
December 17, 2013 | Venus usually appears pretty boring through a telescope. But from mid-December to mid-February it’s a spectacularly long, thin crescent. > read more
January 3, 2014 | This is Jupiter’s week to shine — it’s at opposition! Also lighting the evening sky is the Moon, waxing past first quarter on Tuesday the 7th. > read more
John Darkow has been a professional cartoonist for over 20 years, spending the last 10 as the staff cartoonist at the Columbia Daily Tribune. He is syndicated internationally by Cagle Cartoons.