We are hiring!
Open job positions
As a data engineer, you will:
- Design and optimize our solution's data model
- Integrate data from a wide variety of data sources
- Build and maintain scalable data pipelines in a cloud-computing environment
- Create data tools for other data team members
- Create efficient integrations with other stack applications
- Test and recommend new technologies
- Design the solution's data architecture
- Establish software development best practices
- Have a key impact on product development and real influence on the company
You are the perfect fit if you have:
- Strong proficiency in programming in Python
- Strong experience with data orchestration tools (preferably Airflow)
- Experience in writing advanced SQL queries and leveraging core logic skills to transform data
- Proven, hands-on software development experience in data-related projects
- Familiarity with cloud technologies (e.g., GCP, AWS, Azure)
- Ability to communicate fluently in English
You will earn extra points for:
- Experience with BigQuery
- Expertise with container technologies (Docker, Cloud Run, Kubernetes) and CI pipelines (Cloud Build, GitLab CI)
- Familiarity with IaC (Infrastructure as Code) tools (preferably Terraform)
- Experience working with Google Cloud Platform
We would like to offer you:
- Work with cutting-edge technologies in analytics and data engineering
- Technical challenges that require out-of-the-box thinking
- Startup atmosphere, without startup working hours
- Hybrid working model and flexible working hours
- 15 000 – 30 000 zł salary (B2B), based on experience
- Office in an old tenement house in the city center (pl. Unii Lubelskiej)
- Small team with zero bureaucracy
About us
We are a young, fast-growing SaaS company specializing in Data and Analytics Engineering. Our main product is a cutting-edge analytical platform for companies with recurring revenue models (DTC, SaaS, Subscriptions etc.). We currently provide services to clients from Poland and the USA.
By joining us, you will gain a unique opportunity to shape our technology stack, product and way of working.
We currently use BigQuery as our data warehouse, fed by tailor-made ELT jobs orchestrated by Airflow. The platform's frontend is hosted on Cloud Run; the backend uses Cloud SQL, Redis (soon), and Compute Engine instances. Our infrastructure is managed with Terraform, and dbt is used for data analytics.
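To give a feel for the ELT pattern mentioned above (load raw data first, transform inside the warehouse afterwards, as dbt does), here is a minimal, purely illustrative Python sketch. The data, table names, and functions are hypothetical and not taken from our codebase:

```python
from collections import defaultdict

def extract():
    # Stand-in for pulling raw rows from a source API or database.
    return [
        {"customer": "acme", "amount": 100},
        {"customer": "acme", "amount": 50},
        {"customer": "globex", "amount": 75},
    ]

def load(rows, warehouse):
    # ELT loads raw rows as-is; no transformation happens here.
    warehouse["raw_events"] = list(rows)

def transform(warehouse):
    # A dbt-style model: aggregate raw events into revenue per customer.
    revenue = defaultdict(int)
    for row in warehouse["raw_events"]:
        revenue[row["customer"]] += row["amount"]
    warehouse["revenue_by_customer"] = dict(revenue)

warehouse = {}
load(extract(), warehouse)
transform(warehouse)
print(warehouse["revenue_by_customer"])  # {'acme': 150, 'globex': 75}
```

In production, the `extract`/`load`/`transform` steps would be Airflow tasks and the warehouse would be BigQuery, but the ordering shown here is the same.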