WE'RE LOOKING FOR
We're looking for a Data Engineer to help shape the digital experiences that define our products. In this role, you will be responsible for the end-to-end software development life cycle, from concept to completion. You will help teams reliably collect, move, and store data from connected devices. With us, you'll work with top-notch technologies and cloud providers like AWS.
Responsibilities
- Design, develop, and test batch & real-time data processing solutions and ETL/ELT pipelines
- Build required infrastructure with DevOps teams for optimal extraction, transformation, and loading of data from various sources
- Ensure the quality, reliability, and performance of data pipelines through testing and monitoring
- Collaborate with engineers, architects, data scientists, and business stakeholders across multiple client projects
- Adapt to different client technology stacks and data platform requirements
Technical skills
MUST HAVE
- Ability to write well-structured, tested, and maintainable code, preferably in Python
- Understanding of data modelling, data warehousing concepts and ETL/ELT patterns
- Hands-on experience with AWS streaming and storage services (S3, Kinesis, Glue, Lambda, Redshift, RDS, DynamoDB)
- Strong knowledge of SQL and experience with both relational and NoSQL databases
- Team-oriented mindset with strong communication skills
- Experience working in an agile, international environment
- Strong analytical thinking and structured problem-solving abilities
NICE TO HAVE
- Experience with modern cloud data platforms: Snowflake, Databricks, Redshift, BigQuery
- Knowledge of data transformation tools: Matillion, dbt, Fivetran
- Experience with workflow orchestration: Apache Airflow
- Familiarity with BI tools: Power BI, MicroStrategy, Qlik, QuickSight, Looker
Methodology
- Testing: 70% coverage, TDD, Jest
- Agile management: Scrum
- Issue tracking tool: Jira
- Knowledge repository: Confluence
- Version control system: Git
- Code reviews: GitHub