DevOps Engineer

Posted Feb 3

About the job

Flow Labs is Leading the AI Revolution in Transportation

At Flow Labs, we’re harnessing AI and unprecedented data to transform how the world moves. Our platform captures more real-time transportation data than ever before, feeding powerful AI models that optimize and automate how traffic systems operate.

We’re not just theorizing about the future—we’re building it now. The Flow Platform is already deployed in dozens of cities, optimizing thousands of traffic signals and managing hundreds of thousands of miles of roadway. And we’re scaling fast.

Our AI-driven platform enables transportation agencies to proactively monitor, analyze, and optimize their roadways, making real-time decisions that reduce congestion, lower emissions, and save lives. The results? Travel times cut by 24%, emissions reduced by 21%, and a path to a more efficient, sustainable, and intelligent transportation network.

This is a once-in-a-generation opportunity to redefine an industry. If you’re a software engineer who wants to build at scale, tackle complex AI and big data challenges, and make a real-world impact—we want you on our team.

About You

We're looking for an experienced DevOps engineer to join our team and help us build the future of traffic management. The ideal candidate:

  • Is excited about making an impact with their work
  • Is looking for an environment where difficult real-life problems are being tackled
  • Enjoys working autonomously, with supportive & collaborative teammates from different disciplines
  • Has experience building cloud and bare-metal infrastructure to support large data sets
  • Has experience designing systems to improve the productivity of other engineers on their team

Role and Responsibilities

We are a small nimble team tackling challenging problems, so there will be many opportunities to build and take ownership over complex systems that are integral to our platform. Your primary responsibilities will include:

  • Build and maintain big-data infrastructure on a combination of AWS and bare metal in a way that addresses business needs and tradeoffs
  • Build and maintain development and deployment infrastructure to support other engineers on the team
  • Install, scale, and maintain distributed database systems, currently PostgreSQL/TimescaleDB
  • Help plan product development through the lens of performance and scalability of our data pipelines

Desired Qualifications

  • Substantial professional software development experience
  • Fluency with Kubernetes on AWS and bare metal
  • Experience with bare-metal infrastructure, either professionally or in a robust homelab setting
  • Fluency in Python, JavaScript, and/or Golang
  • Strong fluency with Postgres administration; TimescaleDB extension experience would be a nice addition

Why Flow Labs?

  • Work with cutting-edge technology including JavaScript, GraphQL, Mapbox, hardware-accelerated overlays onto maps, Connected Vehicle Data, Autonomous Vehicles, and TimescaleDB
  • Solve and learn about sophisticated engineering, scientific, and mathematical problems, including cutting-edge machine learning and AI (artificial neural networks, reinforcement learning, modeling and simulations), rigorous data analytics and statistics, big-data processing, optimization and cost management, edge work and non-standard UI, and Kubernetes management
  • Contribute to solving a global problem that impacts climate, sustainability, safety, health, economic vitality, equity, and resilient cities
  • Work with a collaborative, supportive, and diverse team that prioritizes meritocracy, taking action, teaching co-workers new skills, work/life balance, flexibility, and humor
  • Get ample opportunities to grow, learn, and contribute

Bonus Points!

Technologies and techniques you're likely to work with are listed below. It's helpful (though optional) if you have experience with some of these as well:

  • GitOps experience (GitHub Actions, etc.)
  • Big-data experience managing tens of terabytes up to petabytes of actively used data
  • Experience working with geospatial or time-series data sets