Neighborhood Information is NOT Just for the Experts

In 1997 a group of housing activists and their supporters came together to focus attention on the deterioration of residential conditions in Los Angeles. Self-named the Blue Ribbon Citizens Committee on Slum Housing, the group undertook research and advocacy to change the way that public agencies addressed a growing local problem. Two years earlier, the Census Bureau’s American Housing Survey had reported that in the Los Angeles-Long Beach area there were 154,500 substandard apartments in need of major repair, 107,900 infested with rats, and 131,700 units without working toilets. In part to support the work of this Citizens Committee, the Neighborhood Knowledge Los Angeles (NKLA) telecommunications project was begun to create an integrated internet-based information system to pinpoint disinvestment and decay in LA communities at the property level. Trainings on how to use the new system were targeted both at political decision makers and at grassroots community-based organizations.

In the summer of 1998, under pressure from advocates using the newly available housing information, the City of Los Angeles enacted a comprehensive code enforcement program to inspect all rental properties in the city with two or more units. This legislation was heralded by the LA Times as “the most important housing reform in the City’s history.” Since that time, more than 150,000 dwellings have been inspected and more than 400,000 specific code deficiencies remedied. Today, nearly 300 nonprofit organizations are registered on the NKLA site (http://nkla.sppsr.ucla.edu), and they use the information to identify properties for acquisition and rehab, for tenant organizing and homeownership counseling, and to meet other community development objectives.

NKLA receives about 250,000 hits each month, with approximately 100 different individual users on the site each day. NKLA users can be divided into four primary categories: educators, researchers and students (30 percent); government staff (20 percent); nonprofit organizations (20 percent); and community residents (10 percent). Forty percent of users identify conducting property research as their primary aim. Other uses include assembling neighborhood information and collecting data for grant applications and reports.

Putting Data to Work

NKLA, which grew out of a UCLA Community Outreach program in 1995, is a GIS-based project whose goals are coordinating public information in ways that assist neighborhoods; increasing government transparency to all residents, especially those who can’t afford private access; and narrowing the digital divide by providing reasons that low-income users would want to learn how to use new technologies.

The NKLA website is the project’s centerpiece. It is built on a collection of “early warning” databases, made of information from the LA County Tax Collector (delinquencies) and city agencies, including the Departments of Building and Safety (nuisance abatement, code complaints and building permits), Water and Power (utility liens), and Housing (federally-subsidized expiring use properties, and buildings where tenants are withholding rent to force housing improvements). Census data provides context.

All of this information is searchable by property address, street range, census tract, zip code and interactive mapping. Each of the databases can be mapped against any area within the boundary of the City of Los Angeles. The site also includes relevant links to neighborhood information and descriptions of how NKLA has been used locally to address neighborhood issues. Over the last year, a new “Policy Room” has been built on the site, enabling users to query multiple databases at once. For example, this new section can answer a citywide question, such as, “Which census tracts have the highest concentration of both tax delinquency and code complaints?”
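The kind of cross-database question the Policy Room answers amounts to a join between indicator tables that share a geographic key. The sketch below illustrates the idea with an in-memory SQLite database; the table names, columns, and figures are invented for illustration and do not reflect NKLA’s actual schema or data.

```python
import sqlite3

# Two hypothetical "early warning" indicator tables, each keyed by
# census tract (all values invented for this example).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tax_delinquency (tract TEXT, delinquent_parcels INTEGER);
CREATE TABLE code_complaints (tract TEXT, open_complaints INTEGER);
INSERT INTO tax_delinquency VALUES ('2240.10', 120), ('2075.00', 15), ('5323.00', 90);
INSERT INTO code_complaints VALUES ('2240.10', 340), ('2075.00', 20), ('5323.00', 15);
""")

# "Which census tracts have the highest concentration of BOTH tax
# delinquency and code complaints?" -- join the two tables on tract
# and rank by a simple combined score.
rows = conn.execute("""
    SELECT t.tract, t.delinquent_parcels, c.open_complaints
    FROM tax_delinquency t
    JOIN code_complaints c ON c.tract = t.tract
    ORDER BY t.delinquent_parcels + c.open_complaints DESC
""").fetchall()

for tract, delinquent, complaints in rows:
    print(tract, delinquent, complaints)
```

The combined score here is a naive sum; a real policy query might rank by per-parcel rates or require both indicators to exceed a threshold, but the join-on-geography pattern is the same.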

Keeping it Accessible

Because there is a significant “digital divide” between information “haves” and “have-nots,” the development of the NKLA website has always been integrated with outreach and training activities. From the outset, the NKLA site has been available to users in English and Spanish. Over the last year, funds have been made available through private foundations to conduct targeted outreach in two low-income communities located in South Los Angeles and East Los Angeles. This includes helping local community development corporations more effectively use NKLA data and other applications for things like site acquisition, identifying homeowners at risk of foreclosure and getting them needed services, and helping youth explore and express their neighborhoods through the use of new technologies.

Efforts have been made to use experience in the field to inform site content and design. For example, many community residents expressed the view that the NKLA site, with its focus on disinvestment and deterioration, did not accurately portray the strengths within LA’s lower income communities. Moreover, a project called “Neighborhood Knowledge” should not just be the city’s knowledge of neighborhoods, but should bring out the voice and perspective of communities. So a new portion of the site – “Interactive Asset Mapping for Los Angeles” (I AM LA) – was created that enables community residents to directly input neighborhood assets onto the NKLA site through a web-based interface.

With the establishment of the City’s new code enforcement program, a new opportunity arose to develop an improved system for housing data collection and a chance to make more data available to community users. The same group of UCLA researchers was hired by the City of Los Angeles to build a new information system for the city’s first comprehensive residential inspection program, but they agreed only with the understanding that the data produced could be made available to the public via the NKLA website. The aim is to increase transparency so that low-income tenants can go to their branch library and track inspection processes, much as higher-income residents track their Federal Express® packages.

The Nitty Gritty

While conditions may vary city-to-city, the political environment is likely to be more favorable toward projects like NKLA now than six years ago when NKLA began. A growing movement towards new e-government technologies has begun to take hold. Staff who routinely use e-mail now expect easier access to all necessary departmental information from their workstations. Public information projects like NKLA have also established important precedents, and public agencies find it more and more difficult to argue that government digital information is not “public.”

Increasingly, we hear from groups who are interested in creating their own project like NKLA. Four of the most “Frequently Asked Questions” involve access to data, project funding, technology requirements and costs, and technical capacity.

How did you get the government databases?

Most of the data for the NKLA project was spread throughout the City of Los Angeles primarily on desktop computers where a database system had been developed by one staff member to address a particular set of administrative tasks. In general, this data was not on any shared information system (mainframe, network, etc.). In fact, each system was often seen as almost proprietary by the staff who developed it. We went to each of the departments that dealt with housing issues and asked line staff to identify who was the “keeper of the information.”

Our first plan was to then follow the chain-of-command by going to the appropriate administrative heads and asking them to order delivery of the data from the identified staff person. But this would have created delays and made it difficult to get regular updates. It isn’t easy for outsiders to get the boss to give orders and few like to receive orders from above.

Instead, we worked directly with the individuals who built and managed each database system, asking them if they were interested in seeing a database from another department. We spent time building trust – talking about technical matters, identifying the meaning of each data field, complimenting staff members for their initiative in creating the system. We explained repeatedly that making some of the relevant fields public would mean smaller lines at the counters and less daily disruption in their work flow. In a reversal of the top-down strategy, when someone agreed to copy a data disk for us, they often sent an e-mail message or memo upstairs declaring their intention.

It was certainly helpful that we were seen as university researchers and not another competing bureaucrat that was going to use the data to consolidate municipal power. It is likely that as a third party we were able to acquire information more easily than if the Housing Department had attempted to do this on its own.

The County of Los Angeles presented more of a problem. We needed a way of building a property identification backbone and the best solution was data that connected Assessor Parcel Numbers (APNs) to specific addresses. The APNs were to be the unique identifiers for integrating various data sets into a property-based information retrieval system. But the County sells its electronic assessor roll to private purveyors of real estate information, so there was considerable concern that UCLA would make such data freely available over the internet. An agreement was reached to sell UCLA a narrow strip of the Assessor data that wouldn’t impinge upon the broader marketability of the Assessor’s information products.
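The role of the APN as a “property identification backbone” can be sketched simply: one lookup maps an address to its APN, and the APN then serves as the join key across otherwise unrelated datasets. The addresses, APNs, and values below are invented for illustration.

```python
# Hypothetical address-to-APN strip purchased from the County Assessor
# (all values invented for this sketch).
apn_by_address = {
    "123 S MAIN ST": "5149-021-003",
    "456 E 7TH ST":  "5147-010-012",
}

# Separate "early warning" datasets, each keyed by APN.
tax_delinquency = {"5149-021-003": 3}        # years tax-delinquent
code_complaints = {"5149-021-003": 7,        # open code complaints
                   "5147-010-012": 1}

def property_profile(address):
    """Integrate the per-APN datasets into one property-level record."""
    apn = apn_by_address.get(address)
    if apn is None:
        return None  # address not in the Assessor backbone
    return {
        "address": address,
        "apn": apn,
        "years_tax_delinquent": tax_delinquency.get(apn, 0),
        "open_code_complaints": code_complaints.get(apn, 0),
    }

print(property_profile("123 S MAIN ST"))
```

In a production system these lookups would be database joins on an indexed APN column rather than in-memory dictionaries, but the integration logic is the same: every dataset resolves to the same parcel identifier.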

How did you raise the money to get started?

NKLA first emerged from one component of a HUD University-Community partnership grant (approximately $50,000). Through working with the community, we were able to identify some of the worst occupied slum properties, as well as vacant and abandoned nuisance sites. These properties repeatedly showed unpaid tax liens, utility bills, and numerous code complaints. The HUD grant paid for some of the underlying research into which data sets were useful indicators of disinvestment and funded a very simple “beta,” or pilot, NKLA website.

Using this pilot site, as well as Chicago’s online Neighborhood Early Warning System (www.cnt.org), as examples, we proposed to the City of Los Angeles Housing Department that it sponsor a new kind of policy research. Previously the City had used federal resources to pay for formal research reports. We emphasized that our online research could be more easily utilized by both government and the public-at-large, as well as more easily updated. The City was convinced, and provided another $29,000. (A simple system could be built in most cities today for $50,000-$100,000.)

One year ago, the NKLA site was entirely redeveloped and expanded. Funding for this came primarily through the US Department of Commerce National Telecommunications and Information Administration under what is now called TOP (the Technology Opportunities Program). The City also augmented its funding, paying for data integration and the “Policy Room” project components.

What hardware and software tools do you use? How much do they cost?

In order to create, maintain, and access a website such as NKLA, you need to have certain computer hardware and software. For developing and maintaining a website, you either need a server or space on a Web server through your Internet Service Provider. Because NKLA serves a lot of data and maps, we need a fairly robust server. We have a Dell PowerEdge 2300 server (about $16,000), running Windows NT Server as the operating system and Microsoft Internet Information Server as the Web server.

In order to create an interactive site you will need more advanced programming tools than the usual HTML (Hypertext Markup Language) composers like Netscape Composer or Microsoft FrontPage. We use ColdFusion Studio (approximately $400 per license) and ColdFusion Server (approximately $1,300) from Allaire [ed note: now Macromedia]. Other options include Microsoft’s Active Server Pages or a scripting language like Perl.

Because our database is rather large (our table of addresses has nearly 2 million records), we use Microsoft SQL Server to house our data. For many websites, a database tool such as Access or Paradox would probably suffice. For internet mapping we use Geographic Information System (GIS) software from ESRI (www.esri.com): MapObjects and Visual Basic (combined approximate cost $5,000 per license) and Internet Map Server (approximately $1,500).

How much technical expertise is required?

Should a community-based organization hire consultants or partner with universities, or can it learn enough to implement a neighborhood information system project on its own? The answer to this question is, of course, “it all depends.” A community-based research group like Chicago’s Center for Neighborhood Technology led others (universities, cities, counties) in exploring this new community information domain. On the other hand, most community development corporations are unlikely to have on their current staff individuals with the level of data management and GIS skills necessary to undertake such projects. The UCLA team is composed almost entirely of graduates of the Urban Planning Masters program there; hence, it is possible to develop such a project with recent graduates if funds for the necessary hardware, software and online access are secured.

We hope that our experience can inform others (localities, nonprofit groups, foundations, and yes, universities) who are seeking to build their own neighborhood-based information systems. We are encouraged that a California state legislator has agreed to sponsor a bill in the next legislative session to provide nonprofit organizations, local governments, universities and other public agencies with matching funds to replicate the NKLA project in other communities throughout California. The “digital divide” is not simply about access to hardware, but also access to meaningful, community-relevant content – information that can be used by neighborhoods to improve neighborhoods.

This piece was adapted from the NKLA How-To Manual.
