In the newspaper business, some have characterized the Jayson Blair
plagiarism case as a wake-up call. The New York Times reporter
made up datelines, scenes, quotes, conversations, even his expense
account entries. But we seem to have dozed through dozens of alarms;
the cheating continues in waves.
Because it's The New York Times, and Blair's deception was so egregious, it's presumed this is the case that will make a difference. I hope so. But I don't think so.
How and why things got so messed up at the Times, and who is responsible, is not that befuddling. The reasons range from the obvious (top editors ignored their lieutenants' well-founded and unambiguous advice to fire Blair) to conjecture that former newsroom chief Howell Raines, who quit in June, forgave Blair's gross and growing errors because Blair was a favorite and an African American whose success would reflect well on the boss.
There is probably some truth here. What's for sure: Editors dropped the ball not only in failing to fire Blair, but also in assigning him the nation's hottest headline, the D.C.-area sniper story. Blair's bosses were sloppy, on one occasion failing to question a sketchy scoop based on five anonymous sources.
Even when wrongdoers
are caught, too many editors and educators look for reasons to excuse
this behavior, to rationalize it, to give a second chance. Their
argument: Consider the circumstances. Every case is different. There
are nuances of intent, degrees of malfeasance, just like in any
crime. Show a measure of mercy.
Should a 19-year-old student cheater get the death sentence of being kicked out of school? Or should school administrators parse the circumstances and consider an F for the assignment or for the course, or perhaps a semester-long suspension?
How about veteran
reporters? Should years of service be disregarded? Should management
devote weeks to checking out every story the accused has written
(as the Times did with Blair)?
Do newspapers and journalism schools need a penal code, a range of punishments
so everyone knows what sanctions fit what crimes? Yes. And the more
precise these are, the less the inclination to find exceptions,
to make excuses, to commit mistakes of the heart or of expedience.
The trends argue for getting tougher. Course syllabi explain fabrication and plagiarism, and professors review them in class. University handbooks are clear. Yet my own classroom experience has been discouraging. In a couple of years I have dealt with two confirmed cases of plagiarism, and I am sure there have been others I never detected.
I and others
overestimate our ability to spot plagiarists. I informally polled
30 students in reporting classes, and in their anonymous responses
they estimated half their classmates had made up or stolen something
for a story. A number of my colleagues, though genuinely concerned,
raised doubts about what these responses really said. But if I was being alarmist, perhaps they were too quick to discount the students' self-reports. These surveys were hardly scientific, one argument went, and wouldn't the students tend to overestimate the problem
anyway? Perhaps. But with every day, reality and common sense argue
otherwise. I was particularly shocked by a case in my classroom
last year, committed by a student I would have pegged least likely
to cheat. So much for journalists' gut instinct.
What are we
to do? Here are some ideas:
Define fabrication and plagiarism clearly, and circulate the standards to everybody. Set
out the rules and the consequences for violating them in writing, and commit to enforcing them, whether the violator is the best young
prospect, the most veteran and beloved reporter, or the student
with the highest GPA. Journalism schools should require students
to take one-credit classes devoted to this subject. The argument against this (that we can't afford to cram in another credit hour, and that plagiarism education should be incorporated into every class) would be a fine pitch to make, except that's what we're doing now.
Hire more ombudsmen at newspapers, and employ the equivalent in journalism
schools. These in-house watchdogs check out reader complaints and shoddy or questionable practices, and they are a good guard against
sloppy and dishonest work. Newspapers are supposed to be public
watchdogs; yet they resist someone watching them. Many editors
argue that the readers and news sources keep an eye on the newspaper.
But the Times' experience shows readers have come to
accept errors, or have been ignored when they object. The Times,
which always has resisted an ombudsman, finally gave in this summer
and announced on July 30 that it will hire a public editor.
Sign contracts with students in classrooms and with professionals in
newsrooms. Some schools have their students initial pledges acknowledging
they know the rules and the penalties. Would a get-tough policy
offend students? Some, perhaps. But honest students know a lot of
cheating goes on (they have told me so) and they are as
depressed and angry as anyone.
There have been
grand-scale incidents like this before: 20 years ago, Washington
Post writer Janet Cooke wrote fiction so good it won the Pulitzer
Prize. The Washington Post survived. The Blair case will
not lead to the demise of the Times, or deliver a permanently
damaging blow to the industry. Or will it?
If life is 10
percent what happens to us and 90 percent how we respond, now is
the time for us to seriously get to work on the 90.
The author is the new chair of the newspaper department at the S.I. Newhouse School of Public Communications.
The health care system in the United States is a study in
contradictions. The World Health Organization, in its recent report
on the status of health systems around the world, rated ours first
in the world in terms of responsiveness (quality of basic amenities,
choice, dignity, and prompt attention), but only 55th in terms of
financial fairness (a measure based on fairness of financial contribution
and risk protection). Most Americans believe they are entitled to
health care, most are fairly satisfied with the health care they
receive, and most have financial access to health care of the highest
quality. Yet roughly 45 million people, 1 in 6 under age 65, do not have health insurance and, as a result, often
experience problems accessing the care they need. Millions more
are underinsured because their insurance limits the amount and type
of benefits to which they are entitled.
The health care
industry in the United States is a hodgepodge of public and private
interests. Slightly less than half of health care is financed by
the public sector (Medicare and Medicaid, public insurance for
the elderly and poor, and public institutions that provide direct
health care services, such as Veterans Administration hospitals
for military veterans). Slightly more than half of health care is
financed by the private sector (mainly through employment-related
insurance subsidized by employers who have shouldered a significant
portion of the cost of health care since World War II). Likewise,
our hospitals and nursing homes are a mix of both public and private
interests: the private organizations include both nonprofit agencies
driven by a mission to treat the sick or the poor and underserved,
and private, for-profit enterprises whose primary obligation is
to make money for their shareholders. Physician practice is, by
and large, private business.
Our health care
system makes the latest and greatest technologies readily available
to us, but they are expensive. The cost of personal health care
services in the United States is the highest, by any measure, of
any country in the world. Largely as a result of this high cost,
insurance premiums have risen to the point at which many people
cannot afford to purchase even employer-subsidized health insurance.
And employers, whose premium costs are included in the price of
their products, find they are less competitive in the global marketplace.
For the past 25 years, federal, state, and local governments and the private sector have undertaken a variety of initiatives to reduce health care spending (or at least to slow its rate of increase). None of these initiatives has produced lasting results.
We are now in
an escalating debate about the cost of health care. Employers are
trying to cut their spending on health care coverage for workers
and retirees. Government, at all levels, is looking to reduce costs
by cutting payments to providers, limiting entitlements for participants,
and closing public health care institutions. Meanwhile, consumer
demand remains unabated.
The number of
uninsured Americans, particularly during this slow economy, has
increased. Cross-subsidies between private, paying patients (to cover charity care and inadequate payment levels by insurance plans) and government payors, which we have relied on for years, are drying
up. There have been calls for reform, but no agreement
on what reform means or what model of health care delivery is socially,
politically, and economically acceptable.
At the core,
there is no health care system. We lack both a well-articulated
statement of what we, as Americans, expect of health care and a
policy that reflects and grows out of that statement. Our health
care industry evolved in a rather haphazard way. The delivery system
has responded to consumer demand and market forces by providing
more and more service. Public financing for segments of the population
has been made available as the political will of the time dictated.
Growing costs have strained the ability of the private financing
system to play the same role it did in the past. But never did we
develop and implement overarching policies that focus on what we
want from a health care system.
We can't have it all. We can't have the high quality of care we have become accustomed to, as well as universal coverage and low costs. Unless we agree either that unlimited resources should be made available to provide health care, or that it's acceptable to exclude entire groups of the population from financial protection, we must address the only remaining option: the reality that it will be necessary and efficient to ration services.
Thomas H. Dennison, Ph.D., teaches in the Program in Health Services Management at Syracuse University.
During a visit to a housing development in Brooklyn, students involved in the School of Architecture's Community Design Center (CDC) got a firsthand look at what is usually a textbook lesson about urban architecture.
Reactions to the 1971 complex designed by contemporary architecture
critic Kenneth Frampton varied among the students. One claimed it
was scary and the worst neighborhood hed ever seen. Another
concluded that despite the architects intentions, the well-designed
development hadnt changed the culture of poverty and crime
in the neighborhood. Such visits to housing developments allow students
to evaluate architecture in its social and urban contexts, enabling
them to make a complex assessment of both design intentions and
applications. In this case, the design was spawned from the architectural
and social ambitions of the 1960s, an era in which architecture
was thought to have redemptive possibilities.
In the past
year, the CDC has twice undertaken research that examines the New
York State Urban Development Corporation (UDC). Between 1968 and
1974, this state-sponsored entity, which SU Trustee H. Douglas Barclay G'61, H'98 helped create as a state senator, completed
115 housing projects, accommodating more than 100,000 people in
55 communities from New York City to Buffalo. Students from art,
public affairs, and architecture participated in the CDC research
initiative and studied the sociopolitical and architectural implications of the UDC's work.
The study of housing, or more generally, the production of housing in the United States, as either an architectural or urban social issue, is not one that commands the attention of the academy or architects, as
it did when the UDC was established. At that time, housing initiatives
had the support of a government and a public that understood their
link to the War on Poverty and the Great Society programs of President
Lyndon B. Johnson. Architecture was understood as a positive manifestation
of progressive social policy, one in which housing was a human right,
and this philosophy drove UDC administrators. The mission to build
high-quality housing that embodied social aspirations led the UDC
to hire young, inventive architects who often united social and
architectural/urban concerns in their designs.
Though the quality
of neighborhoods built by the UDC was uneven, the zeal and ambition
of its administrators brought public attention to housing as an
architectural and social issue. In 1973, the Museum of Modern Art
in New York City exhibited a UDC-sponsored study by Peter Eisenman's
Institute for Architecture and Urban Studies that examined new prototypes
for public housing. The UDC also initiated one of the most visible
architectural events of the era: a housing design competition
for Roosevelt Island, a long strip of land in the middle of the
East River. It generated interest from young architects around the
world, including Rem Koolhaas and Richard Meier.
But the optimism
and social progressivism of the early '60s yielded to social
unrest that threatened political stability later in the decade.
American housing projects, or "towers in the park," as
they came to be known, that were at one time considered part of
the solution to social inequality began to be discredited. Critics
reassessed the relationship among urban housing type, economic stability,
and social welfare, prompting policy makers and architects to rethink
Modernist models. In 1973, five years after the UDC's creation,
President Richard M. Nixon pulled the plug on federal housing subsidies,
eliminating the capacity of agencies like the UDC to function effectively,
even with substantial contributions of private investment partners.
Since that time,
federal spending on housing has shrunk to historically low levels,
and homelessness in the United States has increased. Opportunities
to design public housing are infrequent. Perhaps because of that,
American architecture school curricula often neglect urban housing,
so few students are adequately exposed to the subject. Yet, housing
is as important an issue as it ever was; approximately 75 percent
of the built fabric in cities is residential.
Since the UDC's demise in the early '70s, the only significant consideration
given to rethinking low- and moderate-income housing models has
been generated by the New Urbanist agenda. Under the direction of
former Secretary of Housing and Urban Development Henry Cisneros,
the New Urban model was employed to transform the failed public
housing built in the '50s and '60s. But the New Urban "fix" to the American version of Modernist projects, whose
formal problems were exacerbated by social and economic ones, does
not serve dense urban populations well. The only way for that to
happen is for government to reengage in the business of housing
as a demonstration of its commitment to essential human rights.
As a facilitator
of invention and technological advance, the government also has
a role. Though the UDC had many failings, its objectives may serve
as a model for contemporary urban housing programs when they do reemerge: objectives that promote technological and architectural invention, and that examine the relationship between urban form and social welfare.
There are no
simple answers. Housing is a large proportion of our built environment.
It is important that a new generation of public policy makers and
architects take interest in, and gain knowledge of, housing issues.
Perhaps by exposing young architects to these issues, urban housing
will again get the attention it deserves.
The author, B.Arch., M.Arch., is an assistant professor at the School of Architecture
and director of the Community Design Center. A licensed architect,
she has practiced in New York City, Boston, and Florence, Italy. Her
research focuses on urban housing and residential block design.