Job offer


Data Engineer

Salary:
1400 – 1650 PLN/MD + VAT
Type of employment:
B2B
Date:
2025.04.15
Location:
Kraków
Offer
  • We are open to the form of employment according to your preferences
  • Work with an experienced and engaged team that is willing to learn, shares knowledge, and is open to growth and new ideas
  • Hybrid: 1 day/week from the office in Kraków, 4 days remote
  • Mindbox is a dynamically growing IT company, but still not a large one – everybody can have a real impact on where we are going next
  • We invest in developing the skills and abilities of our employees
  • We offer attractive benefits and provide all the tools required for work, e.g. a computer
  • Interpolska Health Care, Multisport, Warta Insurance, training platform (Sages)

By creating an inspiring place for talented people to thrive, we use their expertise and courage to introduce the technology of the future into your business. This is the foundation of Mindbox and the goal of our business and technology journey. We operate and develop in four areas:

🤖 Autonomous Enterprise - automation of business processes using RPA, OCR, and AI.

🌐 Business Management Systems (ERP) - we implement, adapt, optimize, and maintain flexible, safe, and open ERP systems for production and distribution companies worldwide.

🤝Talent Network - we provide access to the best specialists.

☁️ Modern Architecture - we build integrated, sustainable, and open CI/CD environments based on containers, enabling safe and more frequent delivery of proven changes in application code.

We treat technology as a tool to achieve a goal. Thanks to our consultants' reliability and proactive approach, initial projects usually become long-term cooperation. For over 16 years, Mindbox has provided a variety of services to support clients in digital transformation.

#LI-Hybrid

 

Tasks
  • Work with a collaborative team of varied disciplines, skills, and experience.
  • Work on a new project to enable the existing platform on another cloud provider.
  • Analyse existing GCP- and BigQuery-based solutions.
  • Design and implement Azure Databricks-based solutions.
  • Work on integration mechanisms for copying large volumes of data between cloud providers.
  • Build and execute complex ETL workflows on an Azure-based platform with extensive use of Azure Databricks.
  • Work on an automation tool for migrating Big Data artefacts between cloud providers.
  • Generate synthetic data for scaled performance testing.
  • Use the Python programming language and a variety of OSS tools to implement a smooth E2E migration process and utilities.
  • Work closely with developers, product owners, and other stakeholders to ensure quality standards.
  • Identify performance bottlenecks and optimise system performance.
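To give a flavour of the synthetic-data task above, here is a minimal stand-alone Python sketch of reproducible test-data generation; all column names, value ranges, and function names are illustrative assumptions, not taken from the posting or any Mindbox codebase:

```python
import csv
import io
import random
import string


def generate_synthetic_rows(n_rows, seed=42):
    """Yield synthetic (id, name, amount) records for load testing.

    The schema here is a hypothetical placeholder; a real migration
    project would mirror the production tables being tested.
    """
    rng = random.Random(seed)  # fixed seed -> the same dataset on every run
    for i in range(n_rows):
        name = "".join(rng.choices(string.ascii_lowercase, k=8))
        amount = round(rng.uniform(1.0, 10_000.0), 2)
        yield (i, name, amount)


def rows_to_csv(rows):
    """Serialise rows to CSV text, as a stand-in for writing to cloud storage."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["id", "name", "amount"])
    writer.writerows(rows)
    return buf.getvalue()


if __name__ == "__main__":
    print(rows_to_csv(generate_synthetic_rows(5)))
```

Seeding the generator keeps performance runs comparable across cloud providers, since both sides can be fed byte-identical input at any scale.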
Requirements

  • Experience working with at least one cloud provider, preferably MS Azure.
  • Experience working with complex ETL pipelines.
  • Deep understanding of Big Data technologies, ideally Spark SQL and/or BigQuery SQL.
  • Experience working with the Databricks data platform, ideally Azure Databricks.
  • Experience in Python coding.
  • Good understanding of CI/CD processes and automation tools.
  • A strong understanding of the technical architecture of complex ETL solutions.
  • Seasoned problem-solving skills and flexibility.
  • Ability to work under time pressure.
  • Experience working in Agile teams.
  • Strong self-organisation and inter-team communication skills.