Uncovering The Early History Of Big Data In 1974 Los Angeles

The concept of “smart cities” seems like a contemporary urbanism trend. But as early as the 1960s, cities were using technology to gather, interpret, and visualise civic data. Here’s how a 1974 report by Los Angeles’s Community Analysis Bureau used computer databases, cluster analysis, and infrared aerial photography to help city officials make policy decisions.

In December 2013, Los Angeles Mayor Eric Garcetti issued an executive order instructing each city department to gather all the data it collects and share it on a publicly accessible website by early the following year. In February 2014, he appointed LA’s first Chief Innovation Technology Officer, and a few months later he launched DataLA, the city’s online data portal. The launch, aimed at a generation who had grown up with smart phones, the internet, and GIS mapping, was promoted with a hackathon hosted at City Hall.

Whether you call the approach “smart cities,” “intelligent cities” or “digital cities,” DataLA puts Garcetti on a growing list of mayors who believe that better use of information technology and data can help them govern cities more effectively, connect residents to city government and resources, and spur high-tech employment. Smart cities have been criticised for prioritising whiz-bang tech over residents’ basic needs and for their potential to widen the economic gap between the technology haves and have-nots. While those are real concerns, the concept of improved urban governance through better use of information is a promising one.

Like many smart, new ideas, however, it’s not new. It’s not even new to Los Angeles, which has been pursuing computer-assisted data and policy analysis for decades. Beginning in the late 1960s and through most of the 1970s, the little-known Community Analysis Bureau used computer databases, cluster analysis, and infrared aerial photography to gather data, produce reports on neighbourhood demographics and housing quality, and help direct resources to ward off blight and tackle poverty.

I have been reading about the history of planning in Los Angeles for years, but the first time I had seen anything by or about the Community Analysis Bureau was when I ran across its insightful-but-weird 1974 report “The State of the City: A Cluster Analysis of Los Angeles” at a library. A data-rich snapshot of LA from forty years ago, the report didn’t categorise Los Angeles into the usual neighbourhoods or community plan areas, but into scattered clusters with names like “the singles of Los Angeles,” “the suburbs from the fifties,” “richest of the poor,” “gracious living,” and more. The nomenclature was seemingly drawn more from market research than traditional city planning reports.

I mentally filed it away as just another 1970s urban experiment, an attempt to sort and categorise places across LA’s expanse. As I read more about the methodology, however, I became intrigued by the Community Analysis Bureau’s ambition to create an “Urban Information System” that could be applied to tackle the problems of the day. I wondered whether this urban intelligence had influenced city policy or programs. How had the bureau fared as the politics of planning, poverty alleviation, and land use in the city changed? Was there a trove of lost data moldering somewhere in boxes of punch cards? I looked up documents on the history of the bureau in the city archives and located several former staff members still living in the Los Angeles area. They were gracious enough to share their memories of the bureau’s work.

Cybernetic Urbanism in the Know-How City

In a 1976 essay, British travel writer Jan Morris summed up Los Angeles as “The Know-How City”:

“Remember ‘know-how’? It was one of the vogue words of the forties and fifties… It reflected a whole climate and tone of American thought in the years of supreme American optimism. It stood for… the certainty that America’s particular genius, the genius for applied logic, for systems, for devices, was inexorably the herald of progress. There has never been another town, and now there never will be, quite like… Los Angeles… where the lost American faith in machines and materialism built its own astonishing monument.”

In the years after World War II, that know-how and faith in machines translated, in part, to an interest in computer-assisted social analysis, thanks to the availability of both mainframe computers and large federal grants during the Cold War. Social scientists in particular were interested in exploring the possibilities that data and computers could bring to public policy, as were city planners and architects. In A Second Modernism: MIT, Architecture and the ‘Techno-Social’ Moment, Arindam Dutta writes that for them, “the emphasis on assembling, collating, and processing larger and larger amounts of data” was “paramount in the postwar framing of expertise.”

Data was the key to know-how, and Los Angeles was key to the techno-optimism of the era. Although the region’s lingering reputation may be for unchecked sprawl and popular entertainment, twentieth-century LA was highly planned — and proud of the systems on which it depended: its networks of streetcars and freeways, its flood control and water infrastructure, and its intentionally fragmented municipal and quasi-public governance. Southern California had a huge high-tech cluster in the aerospace industry. Even the Hollywood studios had their “system.” LA was a temple of progress, “the international symbol of the City of the Future,” as Mayor Sam Yorty put it in his introduction to a 1970 Community Analysis Bureau report. By then, the city had been tapping into the technological know-how of the region for more than a decade.

During the 1950s, the Los Angeles city departments of planning and of building and safety had mocked up computer punch cards for a system they hoped could help track and analyse every piece of property in the city. In 1962, the city submitted a proposal to the Ford Foundation seeking funding for “A Metropolitan Area Fact Bank for the Greater Los Angeles Area.” In proposing the “fact bank,” the mayor’s office noted that Los Angeles “was one of the first non-federal government agencies to use electromechanical and electronic data processing systems in accomplishment of its day-to-day service rendering tasks… the City now staffs and operates three solid state computers and four electromechanical data processing installations.”

The Ford Foundation rejected the proposal, but LA’s leaders were undeterred. In 1964, the city hired Calvin Hamilton as director of the city planning department, in part due to his success in bringing computer modelling to Pittsburgh. Two years later, Los Angeles applied for federal funding to launch a community analysis program that would perform “a comprehensive analysis of the entire city” in order to “prevent further inroads of a physical, economic, and social nature which contribute to… obsolescence.” The city had better luck this time. The grant was approved, and the following year, in January 1967, Mayor Yorty approved an ordinance creating “a department of City Government known as the Community Analysis Bureau.”

Bytes vs. Blight

Like many American cities, LA had been studying and trying to prevent, cure, or clear “slums” for decades. At the end of World War II, the city’s housing authority issued an annual report entitled “A Decent Home…An American Right,” which proclaimed, “No city can thrive when 176,000 of its citizens are living under unsafe and insanitary conditions. The substandard houses in Los Angeles with their filth, squalor, and foul environmental influences are a costly menace and disgrace to our city.” Planners and policy makers believed that badly maintained housing threatened the prosperity, health, and morals, not just of low-income populations living in substandard homes, but of the broader metropolitan area. This was the era of urban redevelopment: Los Angeles’s planning, health, housing, and building departments had created alarming maps showing concentrations of tuberculosis cases, housing without plumbing, juvenile delinquency, and other indicators of poverty.

In forming the Community Analysis Bureau, Los Angeles sought new tools to address the old challenges of deteriorating housing by providing detailed local data to identify neighbourhoods showing early signs of obsolescence. The city had razed “blighted” housing in Chavez Ravine in the early 1950s and, when the CAB launched in the late 1960s, was using federal funding to redevelop the Bunker Hill area. The bureau’s data would help identify blighted areas across the city for renewal efforts like these and inform measures aimed at alleviating the poverty that led to blight in the first place.

Data Hunting and Gathering

The US Census Bureau had gathered and reported statistics on housing quality between 1940 and 1960. The agency stopped directly rating housing quality after finding that only one-third of the units it labelled as dilapidated would be considered so by trained housing inspectors.[19] After 1960, the Census Bureau recommended looking at other characteristics, such as building age, lack of plumbing, and overcrowding, to infer housing quality.

To fill the void left by the Census Bureau, the Community Analysis Bureau adapted and developed a range of technologies and analytic approaches to assess housing and related social conditions at the neighbourhood level. Computerised data storage and retrieval were centerpieces of the bureau’s planned work, with the ultimate goal of helping policy makers plan and budget city responses.

First, however, the bureau had to digitize and centralize relevant information from the US Census, the Los Angeles Police Department, the LA County Assessor, and other private and public sources using the city’s existing IBM-360 mainframe computers. As a partial step toward a comprehensive Los Angeles Urban Information System, the bureau built a database around 220 staff-identified data categories as its nucleus, eventually expanding to 550 categories available for analysing individual census tracts.
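
The bureau’s original system ran on the city’s IBM-360 mainframes, but the underlying idea (one record per census tract, with hundreds of retrievable data categories) maps neatly onto modern tools. Here is a minimal sketch in Python; the column names and figures are invented for illustration:

```python
import pandas as pd

# One row per census tract, one column per data category (the bureau began
# with 220 categories and grew to 550). Names and figures are invented.
tracts = pd.DataFrame(
    {
        "population": [4210, 3875],
        "median_rent_usd": [145, 210],
        "pct_pre_1940_housing": [62.0, 11.5],
    },
    index=pd.Index(["1042", "2213"], name="census_tract"),
)

# Retrieval: one category for one tract, or a whole tract record.
print(tracts.loc["1042", "median_rent_usd"])
print(tracts.loc["2213"])
```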

In 1974, the CAB recommended a strategy of cluster analysis to allow “the data to suggest its own ‘natural’ grouping.” Clustering could identify parts of the city that might be geographically far apart but shared important social and physical characteristics. Bureau staff chose sixty-six key items from the database, including population, ethnicity, education, housing, and crime data, and an environmental quality rating.

Thomas A. Smuczynski developed the Community Analysis Bureau’s cluster analysis techniques, which combined hierarchical and reallocative clustering procedures, and his colleagues programmed the city’s existing mainframe computers in City Hall using Fortran and COBOL. Smuczynski told me that the city was able to hire talented computer programmers in the early 1970s due to layoffs in the aerospace industry. City data was processed with the computer programs SPSS (Statistical Package for the Social Sciences) and BioMed, a data analysis program created at UCLA.
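
The bureau’s Fortran and COBOL programs are long gone, but the two-stage approach described above, a hierarchical pass to suggest groupings followed by a reallocative pass to refine them, is easy to approximate with today’s libraries. Here is a sketch on synthetic stand-in data; the tract and indicator counts mirror the report, and everything else is invented:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(750, 66))  # 750 tracts x 66 standardised indicators

# Stage 1: hierarchical (Ward) clustering proposes thirty initial groups.
tree = linkage(X, method="ward")
initial = fcluster(tree, t=30, criterion="maxclust")  # labels 1..30

# Stage 2: a reallocative pass refines the grouping, reassigning tracts to
# the nearest cluster centroid until assignments stabilise (here, k-means
# seeded with the centroids of the hierarchical clusters).
seeds = np.array([X[initial == k].mean(axis=0) for k in range(1, 31)])
labels = KMeans(n_clusters=30, init=seeds, n_init=1).fit_predict(X)
print(np.bincount(labels))  # number of tracts in each of the 30 clusters
```

Seeding the reallocative stage with the hierarchical centroids is one way to let the data “suggest its own ‘natural’ grouping” rather than imposing arbitrary starting points.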

Using Smuczynski’s techniques, LA’s 750 census tracts were sorted into thirty clusters. Cluster 2, for example, was “The Singles of Los Angeles.” It contained “a very young population with an average age of thirty-three, living in high-density new apartment buildings.” Seven of the nineteen census tracts in this cluster were located adjacent to one another in West Los Angeles and Brentwood. Other tracts with similar young, single, apartment dwellers were found in Palms, Baldwin Hills, Del Rey Palisades, Hollywood, and Bunker Hill.

Cluster analysis also revealed correlations between data and social outcomes. Bureau staff noticed that it didn’t take sixty-six data types to pinpoint which parts of the city had the worst blight and poverty. Three sets of data considered together — birth weight of infants, sixth-grade reading scores, and age of housing — emerged as an accurate indicator of housing decline and socioeconomic disadvantage.
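
A quick regression can illustrate the underlying idea: a well-chosen handful of variables can track a composite outcome nearly as well as a full battery of indicators. This sketch uses synthetic data in which a hypothetical “disadvantage” score is deliberately constructed to depend mostly on stand-ins for the bureau’s three indicators:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 750  # one observation per census tract

# Hypothetical stand-ins for the three indicators (units invented).
birth_weight = rng.normal(3.2, 0.4, n)    # mean infant birth weight (kg)
reading_score = rng.normal(50, 15, n)     # sixth-grade reading percentile
housing_age = rng.normal(40, 20, n)       # median age of housing (years)

# A synthetic composite score driven mostly by those three variables.
disadvantage = (-2.0 * birth_weight - 0.05 * reading_score
                + 0.04 * housing_age + rng.normal(0, 0.5, n))

X = np.column_stack([birth_weight, reading_score, housing_age])
model = LinearRegression().fit(X, disadvantage)
r2 = r2_score(disadvantage, model.predict(X))
print(f"R^2 using only three indicators: {r2:.2f}")
```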

Aerial Photography

Even with a vast array of data at their fingertips, evaluating the physical state of more than a million housing units spread out over Los Angeles’s nearly 500 square miles was an enormous challenge for the bureau — so bureau staff took to the air. A 1970 report from the bureau noted that “the use of colour infrared (CIR) aerial photography offers immediate aid as a relatively inexpensive means of locating those areas most affected by conditions of blight and obsolescence.”

The city’s aerial analysis was led by Robert Mullens II. As a graduate student at UCLA, Mullens wrote his master’s thesis on ways to analyse housing quality from aerial photography, and he was hired by the Community Analysis Bureau to refine these techniques for application in Los Angeles. He recommended that the city use colour infrared aerial photography due to its ability to penetrate haze and to show vegetation quality and small objects.

Aerial photography had evolved by this period into a widely used tool for everything from surveying land and analysing forest health to fighting the war in Vietnam. In her 2003 book From Warfare to Welfare: Defense Intellectuals and Urban Problems in Cold War America, Jennifer Light underscores the intertwined military and civilian origins of aerial photography. She explores how some of the procedures used for urban planning in the 1960s and 1970s — not just aerial photography but also computer analysis and the very metaphor of the cybernetic city — derived partly from Cold War military research. I mentioned this thesis to Gary Booher, who helped Mullens analyse aerial photos of Los Angeles at the Community Analysis Bureau. Booher replied that, whatever the military-civilian cross-fertilization on urban policy, the bureau never had access to the higher-resolution defence department cameras and photographs.

After trial runs over Watts and parts of Northeast LA, the city contracted with aerial survey agencies to conduct overflights of the entire city in 1971 and 1978. Cameras mounted in light planes took colour infrared photos at a 1:6000 scale. Mullens and his colleagues developed a point system to rate the environmental quality of census tracts based on these aerial photos. The purpose was to measure housing quality “by photo interpretation of selected surrogate characteristics of the neighbourhood residential environment.” Each photo sheet was examined and rated by three city staffers to derive an average score between 1 and 10 (a lower number meant fewer problems and better environmental quality). Among the characteristics considered were vacant land, land uses, street conditions, trash, the presence and quality of vegetation, house and lot size, and the presence of “convenience structures” like patios and swimming pools.
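
Whatever the exact weights in Mullens’s point system, the averaging step itself is simple. A toy version in Python, with invented tract IDs and scores:

```python
from statistics import mean

# Hypothetical rater sheets: each of three staffers assigns a 1-10 score
# per photo sheet (lower = fewer visible problems, per the bureau's scale).
rater_scores = {
    "tract_1042": [3, 4, 4],
    "tract_2213": [8, 7, 9],
}

# The tract's environmental quality rating is the mean of the three scores.
environmental_quality = {tract: round(mean(scores), 1)
                         for tract, scores in rater_scores.items()}
print(environmental_quality)  # {'tract_1042': 3.7, 'tract_2213': 8.0}
```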

City staff drove through a sample of neighbourhoods to verify the accuracy of their aerial ratings. These “windshield” observations of dwellings for visual signs of dereliction, plus data from city building inspections and Los Angeles County assessors, helped tweak the environmental quality ratings. The ratings showed that the “City’s center has the worst environmental quality, with the next poorest quality classes forming concentric rings around the downtown area.” The map from the 1978 flyover shows dark areas (bad environmental and housing quality) clustered downtown and to the south and east of downtown. Estimates from this analysis showed that nearly 150,000 dwellings — 13 per cent of LA’s housing stock — could be considered substandard. Most of these needed only minor repair, but 4,807 units — 0.4 per cent of the city total — were considered truly dilapidated and beyond rehabilitation.

While Mullens and Booher still think that aerial photography and the bureau’s rating system were a fairly accurate way to locate housing in need of repair, they acknowledge, in Booher’s words, that “economic bias crept in.” Rating criteria favoured homes with residents who could afford to maintain landscaping, extend decks, and build pools. The focus on lush plantings in those 1970s assessments also seems dated; xeriscaping in today’s drought-conscious California would not rate well using their methodology.

From Analysis to Action

The bureau’s data and analyses were intended to spur interventions in the city. An early report on the bureau’s methodology used the analogy of a “thermostat that samples changes in data… and, based on these measurements, or studies, makes recommendations to operating and staff agencies of the City as to the differences in the desired City climate and the actual.” The city’s data-driven climate control would help to regulate everything from crime rates to unemployment to traffic. That broad range of recommendations reflects the ambitions of the late sixties and early seventies, when urban planners claimed a broader mandate than we are used to today. As the bureau’s first “State of the City” report explained, “It has become obvious that the traditional approach to urban renewal, the treatment only of physical problems, is not adequate… to deal with the social, economic, and physical nature of urban decay.” Recommendations from that report included raising family incomes above poverty level, placing all needy three-to-four-year-olds into preschool, and spurring the construction of 7,000 to 9,000 low-to-moderate income housing units per year, in addition to those already planned.
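
The thermostat analogy describes a basic control loop: measure an indicator, compare it against a target, and recommend a correction where the two diverge. A toy rendering, with metric names and target values invented for illustration:

```python
# Compare measured city indicators with desired targets and flag where an
# intervention might be recommended, in the spirit of the bureau's analogy.
targets = {"unemployment_rate": 0.05, "substandard_housing_share": 0.08}
measured = {"unemployment_rate": 0.072, "substandard_housing_share": 0.13}

for metric, desired in targets.items():
    gap = measured[metric] - desired
    if gap > 0:
        print(f"{metric}: {gap:+.1%} above target -> recommend intervention")
```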

As a kind of think tank inside city hall, the Community Analysis Bureau lacked the authority to put its ideas into action, but the timing was right to explore ways to address inequality. In 1973, Los Angeles elected its first African American mayor, Tom Bradley, who was known to be more committed than his conservative predecessor, Sam Yorty, to assisting disadvantaged neighbourhoods and addressing racial disparities. The next year, the federal government created the community development block grant program to fund redevelopment, social services, and infrastructure in high-poverty neighbourhoods. According to Romerol Malveaux, whose first job was conducting research and administering grants for the Community Analysis Bureau and who remains a community activist forty-five years later, the data the bureau generated helped Los Angeles become the first city to receive community development block grants. LA used these funds to expand social services, maintain streets, and build libraries and parks in low-income areas. Detailed data at the census tract level even allowed the city to identify high-poverty pockets eligible for block grants in wealthier council districts.

But in the end, the bureau was a victim of its own success. The data it collected proved so useful in securing federal grant money that the city focused the bureau’s activities on grant development and administration, with continued data analysis used to justify these funds. Instead of using research to guide the city’s actions, the bureau wound up reacting to the city’s predetermined goals as set out in funding applications. The bureau was folded into a new Community Development Department in the mayor’s office in 1977, where its Community Analysis and Planning Division continued to issue reports until 1980, after which the “community analysis” name was retired.

Legacy and Lessons

Today, most people working on planning, housing, and economic development in Los Angeles have never heard of the Community Analysis Bureau. Having spoken to former bureau staff members and read through some of its reports, I think the history of the bureau — its mission, strategies, accomplishments, and shortcomings — is worth sharing. This early effort to apply computer analysis to the social and physical challenges of a big city might hold some lessons for our contemporary era of big data, smart cities, and digital urbanism.

The bureau never achieved the full ambitions of its founders to create a control panel for what we call a “smart city” today. Gary Booher, who joined the project when it shifted into the Community Development Department, described the 1970s technology as just too “embryonic” to allow real-time data to flow to decision makers who could adjust policies and practices on the fly.

Despite this limitation, staff members who helped develop the bureau’s methods believe they were worth exploring. Thomas Smuczynski, Robert Mullens II, and Romerol Malveaux all told me that it was exciting to work on new techniques for understanding Los Angeles. They also found it rewarding to know that their work had helped identify places and problems where grants could improve people’s lives.

But the ultimate failure of the Community Analysis Bureau suggests that data analysis needs to be better linked to planning, policy, and even advocacy. The bureau wasn’t closely allied with social movements that might have pushed for changes related to the agency’s findings, nor was it sufficiently integrated into the structure of decision making and budgeting in the city. With no core constituency in the heart of city government, the bureau’s findings were easy to dismiss as interesting but inessential factoids. Bureau employees predicted this problem in 1970 in a report that noted, “Political realities must be very carefully amalgamated with the tools of technology. This amalgamation will be difficult at best since, by design, the conclusions of technology tend to be objective, while those of politics tend to be subjective and emotional.” The bureau might not have won any friends in City Hall with self-important statements like these, but there’s some truth there, too.

The bureau may not have brought about the technocratic decision making its early proponents hoped for, but Romerol Malveaux told me that the Community Analysis Bureau did advance equality in a Los Angeles stratified by decades of segregation by providing information on the needs of the city’s many neighbourhoods. There are some hopeful signs that LA’s current smart city efforts have those same inclusive goals. The winners of the hackathon launching LA’s open data portal were a team of high school students whose app is intended to help deliver supplies to homeless shelters. One of the city’s first funded efforts to apply an innovation approach to governance will be an effort to understand whether LA can have revitalization without displacement.

We should judge the data generated by and for smart cities by its social relevance as well as by the volume made publicly available and the processing power and analysis harnessed by the city, the private sector, and academia. Could digital urbanism help narrow the growing gap between society’s 1s and 0s? Will smarter cities help us stay below a 2-degree Celsius rise in global temperatures? If intelligent cities and open data can advance knowledge, efficiency, equity, and sustainability, a new generation of community analysis might fulfil the techno-optimism that took root in 1960s and 1970s Los Angeles and is back with us today.

See additional notes and sources related to this piece here.


Mark Vallianatos is policy director of the Urban & Environmental Policy Institute and teaches at Occidental College. He would like to thank Thomas Smuczynski, Robert Mullens II, Romerol Malveaux, and Gary Booher for their insights, and LA City Archivist Michael Holland and UEPI research assistant Amelia Buchanan for helping locate files on the bureau.

Thoughtful, provocative, and playful, Boom: A Journal of California aims to create a lively conversation about the vital social, cultural, and political issues of our times, in California and the world beyond.

