Description

DataSF believes in the responsible, ethical use of data to improve decision-making and service delivery. We are a small team with an ambitious agenda: to empower the use of data across San Francisco. This work requires empathy and deep curiosity.

As a team member you will stretch and challenge yourself. DataSF’s citywide purview empowers you to tackle data problems that touch all aspects of city life: housing, transportation, justice, health, the built environment, and many more. You will have an unparalleled opportunity to help shape our approach to data use in each of these areas.

Do you get excited about bringing data together for new insights and improved decisions? How about contributing to making San Francisco a better place to live, work, and play? Looking to use your skills to make a positive social impact? Great! Then come join DataSF and empower the use of data in government!

DataSF is a small, growing team working across the City and County of San Francisco. Our mission is to empower the use of data in decision-making and service delivery. We streamline data access through light, agile data infrastructure; improve data management and governance; build capacity to use data through training and data science; and connect it all together, empathetically and ethically, for the greater good of San Franciscans. We accomplish this ambitious goal by hiring equally ambitious people. Yes, we are interested in what you have done, but we are more interested in where you see yourself going.

The City of San Francisco is flush with data, and your mission will be to connect, transform, and automate data sharing to support City departments and the people they serve. DataSF offers data and analytics engineering services to City departments to ensure the timely, efficient publication of data to the City’s open data platform and to support data science. Departments and the public use the data platform to support transparency and equity in services and programs, automate reporting, and develop applications. You’ll also support data science services to departments through the development of sustainable analytics pipelines. You can learn more about this work in the four-part blog series on open data operations and the posts on operating a data science program.

Analytics engineering is a critical part of keeping data fresh, standardizing datasets, and offering value-added data transformations to City departments that improve services for the residents of San Francisco. You will take a lead role in developing and executing modern analytics engineering patterns for the City. We seek someone who is excited to empower the use of data, enthusiastic about open data, and a continuous learner.

Removing barriers and making it easier for all people to access services or knowledge is a core part of any role at DataSF. Beyond any technical skill set or prior work history, accomplishing this ambitious task requires an empathetic understanding of the diverse array of experiences embodied in San Francisco. Your own life experience is a critical contribution to this effort. DataSF is committed to building a team whose diversity reflects the residents we serve.

This is an exciting position for someone eager to harness the power of data to improve transparency, citizen engagement, and government performance, and who is energized by DataSF’s mission of empowering the use of data in decision-making and service delivery.

Skills and responsibilities

The Analytics Engineer will be responsible for maintaining, developing, and coordinating data and analytics engineering services that support the sharing of City data through the City’s data platform and our data science work. Your objectives and related responsibilities include:

You’ll improve the data services offered to departments, including the evaluation and adoption of new tools as needed
  • Continuously assessing and improving the suite of data automation services, including identifying opportunities for self-service, evaluating new and emerging technologies, and streamlining or automating existing business processes
  • Leading the evaluation of new tools and approaches to improve analytics pipelines as needed
  • Helping identify the need for new data services and supporting the creation and deployment of future services
  • Developing documentation to support self-service data automation for other departments
  • Developing automation patterns that enable legacy data systems to leverage cloud-scale analytics platforms safely and securely (DataSF data portal, Snowflake, and dbt)
  • Providing clear thinking and tradeoffs on adopting new technologies and approaches that balance long-term vision with day-to-day operations

You’ll build data & analytics pipelines to support data-driven work
  • Working with staff to develop business transformation requirements for individual datasets and consulting with departments on the best way to automate and publish datasets
  • Applying an ethical lens to the appropriate use of data
  • Creating new analytics pipelines using ETL/ELT approaches per requirements and according to standards and patterns you help develop and refine (a minimal sketch of this pattern follows this list)
  • Serving as the technical lead for database exports, manipulation, and procedures used to create and update data incorporated into the data platform
  • Implementing analytics pipelines and/or data models to support data science and data analytics work as needed
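
To make the ETL/ELT pattern above concrete, here is a minimal sketch in Python. It uses the standard-library sqlite3 module as a stand-in for the warehouse and transformation tooling named elsewhere in this posting (Snowflake and dbt), and every dataset, table, and column name is hypothetical. It illustrates the load-then-transform flow: raw records land in a staging table as-is, then SQL inside the database produces the cleaned, published table.

  # Minimal ELT sketch (illustrative only; all names are hypothetical).
  import sqlite3

  # Extract: pretend these rows came from a departmental source system.
  raw_permits = [
      ("2020-001", "Mission", "issued"),
      ("2020-002", "Sunset", "ISSUED "),
      ("2020-003", "Mission", None),
  ]

  conn = sqlite3.connect(":memory:")

  # Load: land the raw extract unchanged in a staging table.
  conn.execute(
      "CREATE TABLE stg_permits (permit_id TEXT, neighborhood TEXT, status TEXT)"
  )
  conn.executemany("INSERT INTO stg_permits VALUES (?, ?, ?)", raw_permits)

  # Transform: clean and aggregate with SQL inside the database,
  # materializing the table a department or the public would consume.
  conn.execute("""
      CREATE TABLE permits_by_neighborhood AS
      SELECT neighborhood, COUNT(*) AS issued_permits
      FROM stg_permits
      WHERE LOWER(TRIM(COALESCE(status, ''))) = 'issued'
      GROUP BY neighborhood
  """)

  for row in conn.execute(
      "SELECT * FROM permits_by_neighborhood ORDER BY neighborhood"
  ):
      print(row)  # ('Mission', 1) then ('Sunset', 1)

  conn.close()

In production, a framework such as dbt typically manages, tests, and versions that transform step on top of the warehouse, which is what the standards and patterns mentioned above would cover.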

You’ll maintain existing data pipelines
  • Monitoring existing data automations developed on Safe Software’s FME (Feature Manipulation Engine) Server, responding to incidents, and managing changes as needed
  • Identifying opportunities to migrate existing ETLs as needed
  • Analyzing pipeline throughput, issues, and other metrics to inform improvements to the automation platform

Details

  • Location: San Francisco, CA
  • Salary: $123,734.00 - $155,636.00 yearly
  • Deadline: 2020-08-03

Qualifications

DataSF encourages applications regardless of whether you think you meet 100% of the skills listed below.

Personal Skills
  • Excellent oral and written communication skills
  • Investigative ability and intellectual curiosity
  • Ability to learn and embrace new technologies
  • Familiarity with the principles and concepts of open data
  • Comfort with risk and trying new things
  • Ability to work independently and as part of a small team
  • Enjoyment of collaborative processes and developing shared understanding
  • Strong organization skills

Technical/Knowledge Skills
  • 3+ years of experience in related work
  • Experience in data manipulation and analytical thinking
  • Experience writing ETL/ELT code, especially creating and deploying through a framework
  • Strong experience in Python and SQL

Bonus points if you have:
  • Experience training non-technical users to use technology to support their work
  • Strong quantitative analysis skills
  • Strong familiarity with geospatial data and best practices
  • Experience translating business needs into technical implementations, including mapping out business processes and data models
  • Experience working with a variety of databases, APIs, and formats to extract and transform data