
We deliver where others fail
Agile Search offers high-quality recruitment services focused on C-level positions and IT specialists. We use outreach and social-recruiting methodology to reach the market’s best candidates, and we are dedicated to our work with a strong focus on quality and on-time delivery.
We have a strong track record in recruitment for the professional services and IT industries. To succeed, we believe it is essential to maintain a customer-oriented dialogue, bring a high level of energy, and have deep knowledge of both recruitment and what it takes for individuals to thrive in the professional services and IT industry.

Our client Kindred is one of the largest online gambling companies in the world, with over 20 million customers across 100 markets. A pioneer in the online gambling industry and an innovation-driven company built on trust, Kindred has led development in areas such as technological advancement, mobile solutions, new product launches, and improvements to player safety and responsible gambling.
You will be part of the Big Data team that is bringing state-of-the-art analysis techniques to Kindred. You will own and manage the Data Operations work stream that is responsible for provisioning, automation, maintenance and support of the Hadoop ecosystem.
They work in an Agile way with open communication, trust and compassion, always striving to improve themselves, to learn from other members of the team, and to share knowledge.
They are a fast-paced international team working around five core tenets: they are individuals united by a common goal; they challenge the status quo and push the boundaries; they are trusting and always friendly; and they innovate through experimentation and exploration, always.
You will:

Provision the Hadoop cluster and complementary tools.
Monitor and maintain the ecosystem to guarantee uptime.
Automate deployments of application code and configuration changes.
Implement security and audit mechanisms to guarantee data security.
Support the Big Data deployment and handle incidents.
Manage a team of exceptionally talented DevOps engineers.
Continuously improve performance of the team and the technology stack.
Bring world-class knowledge on processes to ensure data quality.
Integrate the Big Data platform with their existing Data Warehouse.
Work closely with development teams, data warehouse teams and other business stakeholders.
Impart best-practice insights to the wider Big Data team.

You (and your team) will be responsible for:

Keeping the deployment running with less than 5% downtime in production.
Responding to incidents according to the corporate SLA.
Interfacing with third parties to ensure fixes are delivered quickly and robustly.
Gatekeeping the access to and usage of the big data platform.
Ensuring your team is sufficiently motivated and focused on delivering the best quality.
Nurturing team members and helping them achieve job satisfaction.

Required qualifications:

Strong experience in Linux system administration and Hadoop platform setup, monitoring, maintenance and support. HDP Certified Administrator or CCA Administrator certification.
Clear hands-on experience of using Kerberos, LDAP and Active Directory.
Experience in programming in Bash and Python, and preferably Java and Ruby.
Experience building and deploying applications on Spark, Storm, Kafka, HBase and Cassandra.
Experience using Splunk for system monitoring.
Deep understanding of real-time data processing concepts and knowledge of industry best practices.
A keen mind with an appetite for problem solving.
A personable, articulate demeanour and sound judgement in managing a team.
Experience managing a distributed DevOps team, preferably including work with an outsourcing company.
Passion for open source technologies and a desire to apply them to large volumes of data.

Desired qualifications:

Red Hat Certified Engineer or equivalent certification.
Experience with traditional data warehouses and knowledge of data marts and star schemas.
Experience using SQL to manipulate data in relational databases.
Understanding of RESTful services and their impact on system performance.
Hands-on experience using traditional reporting tools like QlikView and Tableau.

This role is handled by Agile Search. If you are interested, apply using the button below. It only takes a minute.

This is a job advertisement titled "Big Data DevOps" from the company Agile Search, published on webbjobb.io on 11 November 2017 at 02:00.

How to apply for the job
