Medior Data Engineer @ NIX Tech Kft.

NIX Tech Kft. • Budapest, Hungary
13 days ago
Job description

We are looking for a Data Engineer to join our team and take part in building and maintaining data pipelines using modern technologies. You will work on cloud-based solutions (AWS, GCP, Azure), contributing to data integration, transformation, and delivery processes across diverse client projects. The role involves applying established best practices to ensure performance, reliability, and scalability of data workflows.

WHAT WE OFFER:

  • Competitive compensation packages.
  • Stable employment, based on a full-time employment contract.
  • Private health insurance (Medicare Clinic).
  • AYCM sport pass, providing discounts at various sports facilities in Hungary.
  • Interesting tasks and diverse opportunities for developing your skills.
  • Free training courses, including English.
  • Participation in internal and external thematic events, technical conferences.
  • A spacious office in the heart of Budapest (13th district).
  • All necessary devices and tools for your work.
  • Active corporate life.
  • A friendly and supportive atmosphere within the team.

If you feel you’re ready to join the team, apply for this job now! We’re already looking forward to meeting you!

REQUIREMENTS:

  • 2+ years of experience in data engineering.
  • Proficiency in Python; knowledge of Scala is a plus.
  • Experience with Databricks, Snowflake, or other modern data platforms.
  • Strong command of SQL and familiarity with relational databases (PostgreSQL, MySQL, SQL Server).
  • Exposure to one or more cloud platforms, ideally with services such as:
    • AWS: Glue, Athena, Lambda, DMS, ECS, EMR, Kinesis, S3, RDS
    • GCP: Dataflow, BigQuery, Cloud Functions, Datastream, Pub/Sub, Dataproc, Dataprep
    • Azure: Data Factory, Synapse Analytics, Azure Functions, Data Explorer, Event Hubs, Data Wrangler

NICE TO HAVE:

  • Experience with Delta Lake, Apache Iceberg, or other lakehouse formats.
  • Familiarity with NoSQL databases (e.g., MongoDB).
  • Knowledge of CI/CD for data pipelines, version control, and testing frameworks.
  • Comfort working with containerized or serverless environments (e.g., ECS, Cloud Run, AKS, Lambda).
  • Hands-on experience with dbt Cloud/Core and its integration into data workflows.

RESPONSIBILITIES:

  • Design, build, and maintain robust ETL/ELT pipelines using tools such as Databricks, Apache Spark, dbt, and Snowflake.
  • Ingest and process data from APIs, message queues, relational databases, and files into a cloud-based data platform.
  • Schedule and orchestrate workflows using Apache Airflow or cloud-native alternatives.
  • Enable querying and analytics on large-scale data using data warehouses and SQL engines (e.g., Snowflake, BigQuery, Redshift).
  • Operate across major cloud platforms (AWS, GCP, Azure) and leverage their native data engineering services.

Requirements: Python, Databricks, Snowflake, SQL, relational databases, AWS, GCP, Azure, Data Engineer, Scala, Delta Lake, Apache Iceberg, NoSQL databases, CI/CD, ECS, Cloud Run, AKS, Lambda, dbt Cloud/Core.

Tools: Agile, Scrum, Kanban.

Additionally: International projects, paid English courses, mentoring program, bike parking, free coffee, playroom, shower, free snacks, modern office, no dress code.
