Senior Data Lake Infrastructure Dev (19098)

3/2020 (9m)
Contract via CP
160 000 CZK
This position is currently not available
Are you looking for a job with worldwide reach and deep purpose? Then come join a global healthcare leader as a Senior Data Lake Infrastructure (Big Data) Developer.

Location: Prague
Form of cooperation: Freelance contract
Start of cooperation: 01/2020

About the project: 
Create awesome digital products and enjoy a reward that technology careers don't often bring: the satisfaction of helping to save lives. As part of the Big Data Platform team, you will contribute to the client's Data Lake and cooperate with other teams to tackle the biggest opportunities at the intersection of healthcare, information, and technology.

What will you be working on: 
- responsible for implementation and ongoing administration of Data Lake infrastructure
- design, develop, test and implement a migration of Hadoop services to AWS EMR, Glue and Redshift services over S3
- performance tuning of Data Lake infrastructure
- automation of manual tasks using Ansible
- collaborating with application teams to install operating system updates, service updates, and patches
- Hadoop services support and maintenance - HDFS, Hive, HBase, Spark, and Kafka
- Elastic Cloud Enterprise platform deployment, monitoring and support
- TICK Stack platform deployment, monitoring and support
- research and recommend technical and operational improvements to increase reliability and efficiency

What you need to know: 
- strong experience with UNIX/Linux-based systems and scripting (Bash or Python)
- knowledge of Hadoop ecosystem - YARN, MapReduce, HDFS, HBase, Zookeeper, Kafka, Spark, Hive
- strong experience with configuration management tools such as Ansible, Puppet, Chef or Salt
- knowledge of directory services such as LDAP & ADS
- knowledge of monitoring tools such as Nagios, Telegraf, or Munin
- knowledge of Elastic Stack or TICK Stack
- knowledge of AWS EC2 and S3 services
- distributed systems troubleshooting skills
- ability to communicate in English

Nice to have:
- experience with configuring security in Hadoop using Kerberos or PAM
- knowledge of AWS EMR, Glue, Redshift services
- experience with cloud services such as AWS
- experience troubleshooting Java applications
- experience with agile development