Data Engineer
About Akkodis:
Akkodis is a global leader in the engineering and R&D market, leveraging the power of connected data to accelerate innovation and digital transformation.
With more than 50,000 engineers and digital experts in 30 countries around the world, we offer broad industry experience and strong know-how in key technology sectors such as mobility, software & technology services, robotics, testing, simulations, data security, and AI & data analytics. Our pioneering approach empowers businesses to explore, innovate, and accelerate new possibilities while creating a dynamic culture for our people.
Our people are the foundation of our success. That’s why we champion a company culture where talent is celebrated, and diversity is embraced.
Position Highlights
We are seeking a skilled and experienced Data Engineer to join our team. The ideal candidate will be deeply involved in discovering, analyzing, and assembling large and complex data sets. They will design and build our big data infrastructure, ensure data security, and optimize the performance of our big data platforms, primarily within the AWS cloud environment.
Main Responsibilities:
- Discover, analyze, and assemble large, complex data sets to meet functional and non-functional business requirements
- Design, build, optimize, and maintain robust big data pipelines, architectures, and data sets
- Build scalable ETL pipelines for various data sources using Python in the AWS ecosystem
- Engage with AWS services including AWS Lambda, Amazon EventBridge, Amazon S3, Amazon Kinesis, and AWS Step Functions
- Implement and manage ELK stack or OpenSearch-based platforms
- Ensure robust data security measures and access controls are implemented across our AWS data infrastructure
- Collaborate with team members to improve existing systems, streamline processes, and drive data-related initiatives
- Understand and leverage the architecture of data warehouses for optimal performance
- Troubleshoot and resolve data issues, ensuring data accuracy and reliability
- Collaborate effectively with cross-functional teams, presenting data findings and insights when required
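As an illustration of the pipeline work described above, here is a minimal sketch in Python with boto3 of an S3-triggered transform step of the kind an AWS Lambda function might run; all bucket, key, and handler names are hypothetical, and the transform itself is only an example.

# Illustrative sketch only: read a CSV object from S3, filter it, and write
# the cleaned result to a (hypothetical) curated bucket.
import csv
import io

import boto3

s3 = boto3.client("s3")

TARGET_BUCKET = "example-curated-bucket"  # hypothetical bucket name


def handler(event, context):
    """Handle an S3 'ObjectCreated' notification event."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Read the raw CSV object from the source bucket.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        rows = list(csv.DictReader(io.StringIO(body)))

        # Example transform: drop rows that are missing an 'id' field.
        cleaned = [row for row in rows if row.get("id")]

        # Serialize the cleaned rows back to CSV.
        out = io.StringIO()
        if cleaned:
            writer = csv.DictWriter(out, fieldnames=cleaned[0].keys())
            writer.writeheader()
            writer.writerows(cleaned)

        # Write the result to the curated bucket under a 'curated/' prefix.
        s3.put_object(
            Bucket=TARGET_BUCKET,
            Key=f"curated/{key}",
            Body=out.getvalue().encode("utf-8"),
        )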
Requirements:
- At least 3 years of experience in building, optimizing, and maintaining big data pipelines and architectures
- Strong proficiency in Python
- Hands-on experience with ETL pipeline construction in an AWS environment
- Familiarity with AWS services such as AWS Lambda, Amazon EventBridge, Amazon S3, Amazon Kinesis, and AWS Step Functions
- Experience with ELK stack or OpenSearch platforms
- Solid understanding of data security and of implementing access controls on the AWS cloud platform
- A clear grasp of Data Warehouse architecture concepts
- Exceptional problem-solving and analytical skills, and a knack for troubleshooting complex data challenges
- Strong interpersonal skills with a demonstrated ability to communicate data findings clearly to both technical and non-technical audiences
- Team player with the ability to collaborate effectively in a diverse environment
- AWS certifications in relevant services are a plus
You will get:
- Competitive remuneration package
- Referral bonus program
- 25 days of annual paid leave
- Additional health insurance (outpatient & hospital medical care, dental care, coverage of dioptric glasses, and more)
- Free psychological counselling via a green line and on site
- Newborn or newly adopted child bonus
- Food vouchers - 150 BGN/month
- Upskilling & reskilling training programs and e-learning hub
- Diverse career growth opportunities
- Recognition awards
- Sports cards (partially covered by the employer) and company sports initiatives
- Special company discounts
- Various social and charity initiatives
United by our passion for talent and technology, we look at the world differently.
The future won’t wait; it’s time to make incredible happen. Are you ready?