
About the job
Do you want to be part of a cutting-edge AFRY team working with the latest technology in Big Data and Hadoop full-stack development, on a project with the best developers in the region and one of the most attractive customers around? We are looking for developers with a burning passion for complex development, in a team where you can both contribute and learn from highly skilled team members. You will work in an inspiring and challenging environment.
Responsibilities include maintaining and scaling production Hadoop, HBase, Kafka, and Spark clusters, as well as implementing and administering Hadoop infrastructure on an ongoing basis, including monitoring, tuning, and troubleshooting. You will also provide hardware architecture guidance, plan and estimate cluster capacity, and create roadmaps for Hadoop cluster deployments, improving scalability, service reliability, capacity, and performance.
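Monitoring and capacity planning of this kind are often scripted against the NameNode's JMX endpoint. A minimal sketch in Python; the host name, port, and alert threshold below are illustrative assumptions, not details from this ad:

```python
import json
import urllib.request

# Hypothetical NameNode address; adjust for your own cluster.
NAMENODE_JMX = (
    "http://namenode.example.com:9870/jmx"
    "?qry=Hadoop:service=NameNode,name=FSNamesystemState"
)

def capacity_used_pct(capacity_total: int, capacity_used: int) -> float:
    """Return the percentage of HDFS capacity in use."""
    if capacity_total <= 0:
        raise ValueError("total capacity must be positive")
    return 100.0 * capacity_used / capacity_total

def check_hdfs_capacity(threshold_pct: float = 80.0) -> None:
    """Fetch FSNamesystemState metrics over JMX and warn past a threshold."""
    with urllib.request.urlopen(NAMENODE_JMX) as resp:
        bean = json.load(resp)["beans"][0]
    pct = capacity_used_pct(bean["CapacityTotal"], bean["CapacityUsed"])
    status = "WARNING" if pct >= threshold_pct else "OK"
    print(f"{status}: HDFS {pct:.1f}% full")
```

In practice a check like this would feed an alerting system rather than print, but the shape — poll JMX, compute a ratio, compare to a threshold — is the core of day-to-day cluster monitoring.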

Who are you?
We are looking for a solid, operations-focused Hadoop engineer to administer and scale multi-petabyte Hadoop clusters and their related services. The role focuses primarily on provisioning, ongoing capacity planning, monitoring, and management of the Hadoop platform and the applications and middleware that run on it. You have a tools-first mindset: you build tools for yourself and others to increase efficiency and to make hard or repetitive tasks quick and easy. In addition, you are organized, focused on building, improving, resolving, and delivering, and a good communicator within and across teams.
You have a Bachelor's or Master's degree in Computer Science or a similar technical field.
Qualifications:

Overall 10+ years of work experience, including at least 5 years of production Hadoop experience on medium to large clusters.
Hands-on experience managing production clusters (Hadoop, Kafka, Spark, and more).
Strong development/automation skills.
Must be very comfortable reading and writing Python and Java code.
Experience with GIS, geographical data, and toolkits such as JTS, ArcGIS, QGIS, OpenJump, etc.
Experience with configuration management and automation.
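The tools-first, automation-oriented work described above often starts with small helpers for rolling changes across a cluster. A minimal sketch, assuming password-less SSH to the hosts; the host names, command, and pause length are illustrative only:

```python
import subprocess
import time

def run_rolling(hosts, command, pause_s=30, ssh="ssh"):
    """Run a shell command on each host in turn, pausing between hosts.

    Stops at the first failure so a bad change never reaches the whole
    cluster. The `ssh` argument exists so the transport can be swapped
    out (e.g. for testing). Returns the hosts updated successfully.
    """
    done = []
    for host in hosts:
        result = subprocess.run(
            [ssh, host, command],
            capture_output=True, text=True,
        )
        if result.returncode != 0:
            print(f"{host}: FAILED - {result.stderr.strip()}")
            break
        done.append(host)
        if pause_s and host != hosts[-1]:
            time.sleep(pause_s)  # let the service rejoin before moving on
    return done
```

A real rolling-restart tool would add health checks between hosts and logging, but stop-on-first-failure plus a settle pause is the essential pattern.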

Feedback may be delayed during the holidays.

We offer
We are looking for someone who wants to be part of AFRY’s success story. Are you passionate about technology development? Do you like to work together to find the best solution? Then we can offer you career opportunities in a modern workplace with challenging assignments and exciting projects all over the world.

The AFRY Group is ranked as one of Sweden's most popular employers among engineers. At AFRY you will be involved in developing innovative and sustainable solutions within infrastructure, energy, and industry. We are always looking for the sharpest minds to create the society of the future together with us. We hope you will learn as much from us as we will learn from you.

About the company
AFRY is an international engineering, design and advisory company. We support our clients to progress in sustainability and digitalisation.

We are 17,000 devoted experts within the fields of infrastructure, industry and energy, operating across the world to create sustainable solutions for future generations.

Making Future

This is a job ad with the title "Hadoop engineer" from the company ÅF Technology, published on webbjobb.io on 15 January 2020 at 15:15.

How to apply
