hubQuest
Remote work
2.6M – 3.2M HUF monthly (B2B contract) + VAT
Requirements
- Python
- Azure
- Data engineering
- PySpark
- Databricks
- Azure Data
- SQL
- Polish (C1)
- English (C1)
Nice-to-have skills
- dbt
- Terraform
- Docker
- Kubernetes
Our requirements
What we expect
- 5+ years of professional experience as a Data Engineer or Software Engineer in data-intensive environments
- Strong Python development skills, with solid understanding of OOP, modular design, and testing (unit/integration)
- Experience with PySpark and distributed data processing frameworks
- Hands-on experience with Azure Data ecosystem, including Databricks, Data Factory, Synapse, and serverless compute
- Solid knowledge of SQL and database performance optimization
- Experience in CI/CD and DevOps practices for data pipelines (GitHub Actions, Azure DevOps, or similar)
- Proven ability to refactor complex systems and implement scalable, automated solutions
- Experience in data testing, validation, and observability frameworks
- Strong communication skills and the ability to work independently in a global, collaborative team environment
- Fluent English
Nice to have
- Experience with dbt (data build tool) for transforming and testing data in analytics pipelines
- Experience with Terraform or other Infrastructure as Code tools
- Familiarity with containerization and orchestration (Docker, Kubernetes)
- Understanding of data governance and metadata management principles
- Experience in multi-tenant or multi-market system design
Brief description of the position / project
We are a team of experts bringing together top talents in IT and analytics. Our mission is to build high-performing data and technology teams — from the ground up or by scaling existing ones — to help our partners become truly data-driven organizations.
Currently, we are looking for a Senior Data Engineer to join the Global Analytics Unit, a global, centralized team driving data-driven decision-making and developing smart data products that power everyday operations across markets.
The team’s essence is innovation — we foster a data-first mindset across all business areas, from sales and logistics to marketing and procurement. As the team expands its analytics solutions globally, we’re seeking a hands-on engineer who combines strong software craftsmanship with a passion for building scalable, automated data systems.
Why join us
If you want to:
- Work on complex, large-scale data systems with global impact
- Build robust and scalable data pipelines using modern cloud-native architectures
- Contribute to innovative projects and see your ideas implemented
- Work in a diverse global team of top-tier engineers and data professionals
- Have the freedom to shape tools, technologies, and processes
- Operate in a culture that values autonomy, collaboration, and technical excellence
…then this role is for you.
We also offer:
- Flexible working hours and remote work options
- A relaxed, non-corporate environment with no unnecessary bureaucracy
- Private medical care and Multisport card
- A modern office in central Warsaw with great transport links
- Access to global learning resources, certifications, and knowledge exchange
In short
We’re looking for a software-minded Data Engineer — someone who writes clean, testable Python, designs systems that scale globally, and loves automating everything that can be automated. You’ll have a real impact on the architecture and delivery of global analytics solutions used daily across multiple markets.
Daily tasks
- Design, develop, and maintain end-to-end data pipelines and architectures for large-scale analytics solutions
- Refactor code and REST services to support dynamic OpCo deployments and multi-tenant scalability
- Develop zero-touch deployment pipelines to automate infrastructure and environment provisioning
- Implement data validation and testing frameworks ensuring reliability and accuracy across data flows
- Integrate new pipelines into a harmonized data execution portal
- Build and maintain serverless and Databricks-based data processing systems on Azure
- Design and optimize ETL/ELT workflows in Python (including PySpark)
- Implement Infrastructure as Code (IaC) for reproducible deployments using tools like Terraform
- Collaborate closely with Data Scientists, Architects, and Analysts to deliver production-grade data products
- Troubleshoot, monitor, and improve data pipelines to ensure performance and resilience
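The daily tasks include implementing data validation and testing frameworks for pipeline reliability. As a purely illustrative sketch (the helper names and rules below are hypothetical, not part of the actual stack), a validation step might collect all rule failures per row instead of raising on the first one, so a pipeline run can report every problem at once:

```python
# Minimal sketch of a pipeline data-validation step (hypothetical helper names).
from dataclasses import dataclass, field


@dataclass
class ValidationResult:
    passed: int = 0
    failures: list = field(default_factory=list)  # (row_index, rule_name) pairs

    @property
    def ok(self) -> bool:
        return not self.failures


def validate_rows(rows, rules):
    """Apply named rule callables to each row, collecting failures
    instead of raising, so one run reports all problems."""
    result = ValidationResult()
    for i, row in enumerate(rows):
        for name, rule in rules.items():
            if rule(row):
                result.passed += 1
            else:
                result.failures.append((i, name))
    return result


rows = [
    {"order_id": 1, "amount": 99.5, "market": "PL"},
    {"order_id": 2, "amount": -3.0, "market": "HU"},  # violates non_negative_amount
]
rules = {
    "has_order_id": lambda r: r.get("order_id") is not None,
    "non_negative_amount": lambda r: r.get("amount", 0) >= 0,
}
report = validate_rows(rows, rules)
# report.ok is False; report.failures == [(1, "non_negative_amount")]
```

In a real Databricks/PySpark pipeline the same idea would typically be expressed with DataFrame-level checks, but the collect-all-failures pattern is the same.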
Specifications
- Online job interview
- Recruitment languages: English & Polish
- Immediate start
- Remote work
- Flexible working hours
- No travel involved
- Mainly new features
Equipment provided
- Apple
- Windows
- Computer: notebook
- as many as you need 🙂
Methodology
- Agile management: Scrum, Agile
- Issue tracking tool
- Knowledge repository
- Code reviews
- Version control system
- Cloud infrastructure: Azure
- Freedom to choose your work tools
In-office perks
- Free coffee
- Bicycle parking
- Free beverages
- In-house training
- In-house hack days
- Modern office
- No dress code
Extras
- Sports package
- Private healthcare
- Training budget
- Small team
- International projects