What can I learn from working on the City Scrapers?

A lot about cities! What is Chicago's City Council talking about this week? What are Local School Councils, and what community power do they have? What neighborhood is the police department doing outreach in? Who governs our water?

From building a scraper, you'll gain experience with:

  • how the web works (HTTP requests and responses, reading HTML)
  • writing functions and tests in Python
  • version control and collaborative coding (Git and GitHub)
  • a basic data file format (JSON), working with a schema and data validation
  • problem solving, finding patterns, and designing robust code
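To give a feel for those skills in practice, here is a small, self-contained sketch of the kind of task a scraper automates: pulling meeting titles out of HTML and serializing them as JSON. This is an illustration only, using the Python standard library with made-up HTML; the project's real spiders are built on the Scrapy framework.

```python
import json
from html.parser import HTMLParser

class MeetingTitleParser(HTMLParser):
    """Collect the text of every <h2 class="meeting"> element."""

    def __init__(self):
        super().__init__()
        self.titles = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        # Start capturing text when we enter a matching <h2>.
        if tag == "h2" and ("class", "meeting") in attrs:
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.titles.append(data.strip())

# Hypothetical page snippet, standing in for a real HTTP response body.
html = """
<h2 class="meeting">City Council Budget Hearing</h2>
<h2 class="meeting">Local School Council Meeting</h2>
"""

parser = MeetingTitleParser()
parser.feed(html)
print(json.dumps(parser.titles))
```

A real spider does the same thing at larger scale: fetch a page over HTTP, find the pattern that marks each meeting, and emit structured data that can be validated against a shared schema.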

Community Mission

The City Bureau Labs community welcomes contributions from everyone. We prioritize learning and leadership opportunities for people who are under-represented in tech and journalism.

We hope working with us will fill experience gaps (like using Git and GitHub, working with data, or having your ideas taken seriously), so that more under-represented people will become decision-makers in tech and media.

Ready to code with us?

  1. Fill out this form to join our Slack channel and meet the community.
  2. Read about how we collaborate and review our Code of Conduct.
  3. Get started with Installation and Contributing a spider.

We ask all new contributors to start by writing a spider and its documentation, or by fixing a bug in an existing one, to gain familiarity with our code and culture. Reach out on Slack for support if you need it. We also meet up in person regularly to work together, and we post about upcoming meetups in Slack.

If you're already familiar with the project, please see the help-wanted GitHub issues.

Don't want to code?

Join our Slack channel (chatroom) to discuss ideas and meet the community!

We have ongoing conversations about what sort of data we should collect and how it should be collected. Help us make these decisions by commenting on issues with a non-coding label.

Good with Google? Help us research sources for public meetings. Answer questions like: Are we scraping events from the right websites? Are there local agencies that we're missing? Should events be updated manually or by a scraper? You can dive in today by triaging event sources on these issues.

Join us in-person at our Open Coding Sessions!

We meet every other Monday to code together, tackle issues, and engage in collective decision-making. The Open Coding Sessions are open to coders and non-coders alike. Be sure to bring a computer to each session so we can get you started.

Become a Documenter

The City Scrapers project operates as the data collection system for City Bureau's Documenters program, which pays and trains community members to document public meetings and engage in the production of journalism, news, and information. You can become a Documenter by applying here (everyone is accepted) and attending at least one City Bureau-sponsored training.