• Note that the application deadline for this ad may have passed. Read the ad carefully before proceeding with your application.

You will:
Provision the Hadoop cluster and complementary tools.
Monitor and maintain the ecosystem to guarantee uptime.
Automate deployments of application code and configuration changes.
Implement security and audit mechanisms to guarantee data security.
Support the Big Data deployment and handle incidents.
Manage a team of exceptionally talented DevOps engineers.
Continuously improve the performance of the team and the technology stack.
Bring world-class knowledge of processes to ensure data quality.
Integrate the Big Data platform with our existing Data Warehouse.
Work closely with development teams, data warehouse teams and other business stakeholders.
Impart best-practice learnings to the wider Big Data team.
How will you be measured?
You (and your team) will be responsible for:
Keeping the deployment running with less than 5% downtime in production.
Responding to incidents according to the corporate SLA.
Interfacing with third parties to ensure fixes are delivered quickly and robustly.
Gatekeeping access to and usage of the Big Data platform.
Ensuring your team is sufficiently motivated and focused on delivering the best quality.
Nurturing team members and helping them achieve job satisfaction.

You must have:
Strong experience in Linux system administration and Hadoop platform setup, monitoring, maintenance and support.
An HDP Certified Administrator or CCA Administrator certification.
Clear hands-on experience using Kerberos, LDAP and Active Directory.
Experience programming in Bash and Python, and preferably Java and Ruby.
Experience building and deploying applications on Spark, Storm, Kafka, HBase and Cassandra.
Experience using Splunk for system monitoring.
A deep understanding of real-time data processing concepts and knowledge of industry best practices.
A keen mind with an appetite for problem solving.
A personable and articulate demeanour with a good sense for managing a team.
Experience managing a distributed DevOps team, preferably with an outsourcing company.
A passion for open source technologies and a desire to apply them to large volumes of data.
You could have (not a must):
A Red Hat Certified Engineer or equivalent qualification.
Experience with traditional data warehouses and knowledge of data marts and star schemas.
Experience using SQL to manipulate data in relational databases.
An understanding of RESTful services and their impact on system performance.
Hands-on experience with traditional reporting tools such as QlikView and Tableau.

Kindred is one of the largest online gambling companies in the world, with over 15 million customers across 100 markets. The company was founded in 1997 by Anders Ström and listed on Nasdaq Stockholm in 2004. Kindred is committed to offering our customers the best deal and user experience possible, while ensuring a safe and fair gambling environment.
We offer pre-game and live Sports betting, Poker, Casino and Games through 10 subsidiaries and brands across our markets. While our core markets are in Europe and Australia, we address most global markets.
Kindred is a pioneer in the online gambling industry. As an innovation-driven company built on trust, we have led the industry's development in areas such as technological advancements, mobile solutions, new product launches, and player safety and responsible gambling improvements.

This is a job ad titled "Big Data DevOps" from the company North development ab, published on webbjobb.io on 2 June 2017 at 00:00.

How to apply for the job
