Creating an inspiring place for talented people to thrive, we use their expertise and courage to introduce the technology of the future into your business. This is the foundation of Mindbox and the goal of our business and technology journey. We operate and develop in four areas:
🤖 Autonomous Enterprise - automation of business processes using RPA, OCR, and AI.
🌐 Business Management Systems (ERP) - we implement, adapt, optimize, and maintain flexible, secure, and open ERP systems for production and distribution companies worldwide.
🤝 Talent Network - we provide access to the best specialists.
☁️ Modern Architecture - we build integrated, sustainable, and open CI/CD environments based on containers, enabling safe and more frequent delivery of proven changes to application code.
We treat technology as a tool to achieve a goal. Thanks to our consultants' reliability and proactive approach, initial projects usually become long-term cooperation. For over 16 years, Mindbox has provided a range of services supporting clients in digital transformation.
#LI-Hybrid
We are looking for Data Engineers to join the IT team within the Environmental, Social & Governance department of the Data and Analytics office. The engineering team is responsible for taking business logic and proof-of-concept (PoC) asset designs and using them to create robust data pipelines using Spark in Scala. Our pipelines are orchestrated through Airflow and deployed through a Jenkins-based CI/CD pipeline. We operate on a private GCP instance and an on-premises Hadoop cluster. Engineers are embedded in multi-disciplinary teams including business analysts, data analysts, data engineers, software engineers, and architects.
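To give a flavour of the day-to-day work, below is a minimal sketch of the kind of typed transformation such a pipeline step might encode. The `Emission` case class, the field names, and the aggregation itself are hypothetical examples (not taken from the actual role); plain Scala collections stand in for a Spark `Dataset` so the sketch runs without a cluster, but the same `filter`/`groupBy`/aggregate shape carries over to Spark's typed API.

```scala
// Hypothetical ESG-style pipeline step: aggregate reported CO2 emissions
// per entity for a given reporting year. In a real Spark job, `Seq[Emission]`
// would be a Dataset[Emission] and the same transformations would run
// distributed across the cluster.
case class Emission(entity: String, year: Int, tonnesCo2: Double)

object EsgAggregation {
  def totalsByEntity(records: Seq[Emission], year: Int): Map[String, Double] =
    records
      .filter(_.year == year)          // keep only the reporting year
      .groupBy(_.entity)               // one group per reporting entity
      .view
      .mapValues(_.map(_.tonnesCo2).sum) // sum emissions within each group
      .toMap
}
```

For example, `totalsByEntity(Seq(Emission("A", 2023, 1.0), Emission("A", 2023, 2.0)), 2023)` yields `Map("A" -> 3.0)`.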
Must-have skills:
- Scala
- Experience designing and deploying large-scale distributed data processing systems with technologies such as PostgreSQL or equivalent databases, SQL, Hadoop, Spark, and Tableau.
- Proven ability to define and build architecturally sound solution designs.
- Demonstrated ability to rapidly build relationships with key stakeholders.
- Experience with automated unit testing, automated integration testing, and a fully automated build and deployment process as part of DevOps tooling.
- Ability to understand and develop the logical flow of applications at the technical code level.
- Strong interpersonal skills and the ability to work in a team and in global environments.
- Proactive, with a learning attitude and the ability to adapt to dynamic work environments.
- Exposure to Enterprise Data Warehouse technologies.
- Experience in a customer-facing role working with enterprise clients.
- Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible, Jenkins), and requirement management in JIRA.
Must-have technologies:
Hadoop, Hive, HDFS, Apache Spark, Scala, GCP, Jenkins, Airflow, SQL