Reporting to: Line Lead, Big Data
Department: Big Data
Hours of work: Full-time
We are looking for someone to take on the role of Big Data DevOps Engineer for our Big Data team.
This is a great opportunity for anyone who is passionate about, and has experience with, building and maintaining an enterprise Big Data platform and solutions. The goal is to make sure we have a top-class Big Data platform, both technically and in terms of 24/7 platform availability and application deployment.
The Big Data team is one of three teams in the Data department; the other two are the BI and DWH teams. The team consists of 8 people based in Stockholm, responsible for building and maintaining a Big Data platform and services on HDP, DataStax and Elasticsearch. We work with product owners who help us work according to the right business priorities.
Big Data is a key team for the Unibet Group, and its platform is used across different parts of the business. We have an increased focus on the data area, and there are lots of interesting challenges coming up: scaling up, finding new ways of doing things, using new data technologies, and enabling data as a revenue driver for the company to a larger extent.
The key accountabilities of this role are:
Design, implement and maintain enterprise-level security (Kerberos, Active Directory, etc.)
Troubleshoot Hadoop-related applications, components and infrastructure issues at large scale
Create runbooks for troubleshooting, cluster recovery and routine cluster maintenance
Provide 3rd-level support (DevOps) for business-critical applications and use cases
Work closely with the infrastructure and network units
Manage cluster node configuration, connectivity and capacity, including NameNode/DataNode/JobTracker deployment layout, server requirements, and SAN and RAID configurations
We're looking for someone with the following skills, knowledge and experience:
At least 4 years of excellent hands-on experience with the Hadoop ecosystem, including HDFS, Kafka, Spark, Splunk, Pig, Hive, Oozie, ZooKeeper and Flume
Excellent hands-on experience with Python/shell scripting and Linux/Unix
Well versed in installing, upgrading and managing distributions of Hortonworks, Cassandra and Elasticsearch
Experience managing large-scale, secure, highly available Hadoop infrastructure supporting rapid data growth for multiple internal customers, including installing operating system and Hadoop updates, patches and version upgrades
To apply for this role, click the "Apply for this role" button, complete the short web form, and attach a CV with a cover letter stating why you would be suitable for the position. The recruitment team will be in touch shortly.
For details on the benefits package available, please click here.