Job offer

DevOps/Data Engineer

Salary:
1300-1600 PLN net + VAT / MD
Type of employment:
B2B
Date:
2026.02.05
Location:
Kraków

At Mindbox we connect top IT talent with technology projects for leading enterprises across Europe.
As a Data Engineer, you will design and implement robust, scalable, and reusable data solutions, working on ETL/ELT pipelines, automation, and analytics to support security initiatives. This role involves close collaboration with stakeholders, technical analysis, and delivering high-quality solutions in an Agile environment.

Does this sound like your kind of challenge?


What you'll be doing

  • Design, develop, and maintain data pipelines and ETL/ELT processes using Python.
  • Implement and optimize workflows using Apache Airflow or similar frameworks.
  • Work with BigQuery and/or Azure Databricks for data processing and analytics.
  • Translate detailed designs into scalable, reusable solutions ensuring performance and security.
  • Perform technical analysis for projects, changes, and production implementations.
  • Collaborate with stakeholders to influence design decisions and achieve desired outcomes.
  • Apply best practices in release management, quality control, and DevOps.
  • Contribute to continuous improvement through automation and efficient processes.
  • Work within Scrum/Agile methodologies and participate in design, development, and testing phases.

Note: Detailed project information will be shared during the recruitment process. 


What you get in return

  • Flexible cooperation model – choose the form that suits you best (B2B, employment contract, etc.)
  • Hybrid work setup – remote days available depending on the client’s arrangements (6 days per month in Kraków)
  • Collaborative team culture – work alongside experienced professionals eager to share knowledge 
  • Continuous development – access to training platforms and growth opportunities 
  • Comprehensive benefits – including Interpolska Health Care, Multisport card, Warta Insurance, and more 
  • High quality equipment – laptop and essential software provided 

Who we're looking for

Must-have:

  • Hands-on experience in data pipelines and ETL/ELT processes using Python.
  • Strong knowledge of ETL frameworks (Apache Airflow, Pandas, PySpark, Dagster).
  • Proficiency in Python development (NumPy, Pandas, Keras).
  • Practical experience with Apache Airflow and BigQuery / Azure Databricks.
  • Familiarity with Git and efficient branch management.
  • Strong experience in data analysis and developing production-grade code.
  • Proven track record in delivering enterprise-grade technical solutions.
  • Experience working in Agile/Scrum environments.
  • Excellent communication skills in English (written and verbal).

Nice-to-have:

  • Hands-on experience with DevOps practices and tools.
  • Background in financial services or large-scale enterprise environments.

 

By joining this project, you'll become part of Mindbox – a tech-driven company where consulting, engineering, and talent meet to build meaningful digital solutions. We'll back you up every step of the way, accelerate your development, and ensure your skills make a difference.


Ready to take the next step?

Submit your application! We look forward to reviewing your profile 😊 

Know someone who might be a great fit? 
Feel free to share this opportunity using the referral link: Mindbox Referrals System