Clearcover is the smarter car insurance company. We use powerful technology to offer everyday drivers better coverage for less money. We’re proud to be one of the fastest-growing startups in Chicago, and we’re currently looking to add a few more extraordinary people to our team.
What is a Data Engineer?
Reporting to Braun Reyes, the Data Engineer is responsible for building, owning, monitoring, and maintaining our data and analytics ecosystem. This platform powers our business decisions and data product initiatives by enabling our analysts, engineers, and data scientists with the data access and tools they need to deliver value. Core responsibilities include extracting and ingesting raw data into our data warehouse/data lake and providing frameworks and services for operating on that data. Our data comes from multiple sources, including clickstream, change data capture, webhooks, SFTP, and APIs, in formats that include tabular, JSON, and XML.
What will you do?
- Build batch and real-time data pipelines into our Snowflake data warehouse using AWS serverless, open-source ELT (extract-load-transform) frameworks, and ETL-as-a-service technologies.
- Create workflows to monitor and alert on service/job health and data quality.
- Deploy and own the full set of cloud infrastructure created to support data management workflows.
- Develop custom utilities and components as needed to complement existing tooling.
- Implement automated deployment of data pipelines using CI/CD best practices.
- Ensure privacy, compliance, and security of data.
- Participate in integrating our core data assets into the rest of the technical organization and Clearcover product suite.
- Provide technical mentorship to fellow engineers and analysts.
- Advise and assist on application-side data modeling and data science workflow design.
What do you need?
- Fluency in Python for data extraction, transformation, and cloud service API interaction.
- Ability to write SQL for reporting, transformations, and data loading/extraction.
- Experience implementing, testing, debugging, and deploying batch data pipelines using workflow management tools like Apache Airflow, Azkaban, dbt, and/or AWS Step Functions.
- Experience with managing and loading data into cloud data warehouse services like Snowflake and/or Redshift.
- Experience deploying containerized applications and workflows using Docker on platforms like Kubernetes and/or ECS/Fargate.
- Experience building real-time data pipelines using services like Alooma and Fivetran, and/or technologies like Amazon Kinesis and Apache Kafka.
- Experience deploying and administering cloud DBaaS (Database as a Service) offerings like RDS (PostgreSQL, MySQL, Aurora) and/or DynamoDB.
- Ability to automate cloud infrastructure provisioning via Terraform and/or CloudFormation.
- Solid understanding of CI/CD workflows for data pipelines/services.
Nice to haves?
- Interest in writing about technical solutions developed by Clearcover Data Engineering.
- Experience deploying data science workflows and/or services using frameworks like Amazon SageMaker, leveraging machine learning, deep learning, and computer vision.
- Experience with the AWS serverless stack (Lambda, SNS, SQS, Kinesis, Step Functions, Fargate, DynamoDB).
- Familiarity with BI tools such as Periscope, Tableau, and Looker.
What's in it for you?
- Unlimited vacation; we hire adults
- Equity for all employees, so you own a piece of the pie too
- Dental and Vision, we've got you covered 100%
- Medical, we cover 90% of your premium and 75% of your dependents', and contribute to your HSA and HRA (cha-ching)
- We invest in your future by contributing 3% of your salary to a 401(k), even if you don't
- Commute pre-tax through our FSA commuter benefits
- and yes, we have unlimited LaCroix, beer, snacks and the occasional ice cream social