
Senior Data Engineer (Snowflake)

Job Description

About the job

Established in 2007, my client, now a 250+ strong team from 50 nationalities across 15 countries, specializes in serving the unique needs of microbusinesses. Their diverse portfolio includes a GDPR-compliant AI-powered website builder, online store, logo creator, legal text generator, business listings, social media integration, SEO, and more. Recently, they introduced the first all-in-one solution of its kind in Germany, streamlining the journey from idea to success for those starting their self-employment. Constantly innovating, they aim to simplify and support microbusiness owners, with more exciting products and services on the horizon.

The role:

As a Senior Data Engineer, you'll lead the design, development, and maintenance of a dynamic data platform, collaborating with analytics engineers and analysts. Your role involves creating scalable and reliable data solutions, focusing on data pipelines, ETL processes, and foundational infrastructure. The evolving Data Platform recently shifted from Redshift to Snowflake, maintaining a robust AWS presence. Key technologies include Python, SQL, DBT, and Airflow for data pipelines, with Kubernetes, Docker, GitHub Actions, and Terraform forming the core infrastructure.

Your Responsibilities:

* Design, implement, and maintain data infrastructure and systems supporting organizational data initiatives.

* Prototype and develop machine learning capabilities.

* Enhance and maintain real-time event messaging and tracking.

* Infuse software engineering best practices in data, emphasizing testing, monitoring, and a portable, reusable codebase.

* Contribute to establishing data contracts ensuring end-to-end data quality.

Your Qualifications:

* Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.

* Robust software engineering background, particularly in backend development.

* Proficiency in Python is a must; familiarity with Rust, Kotlin, Java, or Go is advantageous.

* Experience with event streaming platforms such as Kafka, Kinesis, or Dataflow.

* Knowledge of event-tracking architecture (Snowplow, Rudderstack) is beneficial.

* Familiarity with data platforms like Snowflake (or equivalent) and data modeling tools like DBT.

* Proficiency in software engineering practices, including Docker, CI/CD, and testing.

* Proactive in learning new technologies and languages.

* Strong communication skills for collaboration with technical and non-technical stakeholders.


What they offer:

🌍 Meaningful Mission: Our core mission is empowering solopreneurs and small businesses, contributing to community sustainability.

⏰ Flexibility and Trust: Enjoy a fully remote setup within Europe, with opportunities for in-person collaboration and a focus on results over hours.

💼 Support for Side Hustles: Encouraging side projects with up to three free websites for personal or professional use.

🌐 International, Diverse Team: A team of 248+ individuals from 50+ countries working across 15+ locations, promoting openness and inclusivity.

📈 Continuous Growth: Investment in your development through company-wide access to LinkedIn Learning and other initiatives, fostering career progression.

If you're interested, get in touch to find out more.