MID Spark Developer
Margo-Group
Location: Warsaw, Poland
Function: Data Engineering
Contract: Full time
Remote Work Policy: Part-time (hybrid)
Job Description
Margo Consulting Poland is a consulting and technology company specializing in delivering top IT experts to leading financial and technology institutions across Europe.
We are looking for an experienced Spark Developer to join a dynamic Big Data team for one of our clients in the banking sector. The project focuses on large-scale data processing and machine learning implementation support. This is a great opportunity to be part of an innovative environment, collaborating closely with advanced analytics and data science experts across different business departments.
Margo Offers:
- Monthly salary or daily rate, depending on the cooperation model,
- Ability to work in an international consulting company on ambitious projects,
- Permanent contract or B2B cooperation,
- Benefits such as medical care and sports card,
- Co-financing of trainings, certification exams, and post-graduate studies,
- Internal training and the possibility of using our know-how,
- Possibility to use our library free of charge,
- Individual approach and development opportunities (career path planning, ability to change the project and position, possibility to get involved in outside-project activities with additional remuneration),
- Possibility to influence the shape of the company, openness to your ideas and willingness to implement them,
- Excellent working atmosphere, integration events.
Cooperation model: hybrid (remote + on-site in Warsaw office, 2 times per month)
Required Skills (must have):
- Minimum 2 years of professional experience in Spark
- Technical background (IT/Engineering studies)
- Solid understanding of Big Data concepts, Data Warehousing, and Data Management
- Experience with Hadoop platforms (Cloudera/Hortonworks)
- Knowledge of engineering best practices for large-scale data processing: design standards, data modeling techniques, coding, documenting, testing, and deployment
- Hands-on experience with data formats: JSON, Parquet, ORC, Avro
- Understanding of database types and usage scenarios (Hive, Kudu, HBase, Iceberg, etc.)
- Advanced SQL skills
- Experience integrating data from multiple sources
- Familiarity with project/application build tools (e.g., Maven)
Nice to Have:
- Practical knowledge of Agile methodologies and tools (Jira, Confluence, Kanban, Scrum)
- Experience with Kubeflow
- Knowledge of streaming technologies such as Kafka, Apache NiFi
- Familiarity with CI/CD automation processes and tools
Why join us?
- A stable long-term project within a major financial institution
- Collaboration with highly skilled specialists in a large enterprise environment
- Opportunity to work on high-impact projects in the banking industry with exposure to Machine Learning solutions