Experience: 3-7+ Years
Responsibilities / Experience:
Minimum 3-4 years of experience in designing and building Big Data solutions (Hadoop & Ecosystem).
Long-standing experience with Cloudera & Hortonworks as well as associated Apache projects.
Design and implementation of data-ingestion and processing pipelines.
Continuous integration, delivery, and deployment (Jira, GitLab/TFS, Ansible) as a means of quality assurance.
Experience working in large agile environments.
Experience working with international teams, with excellent written and spoken communication skills.
Skills:
Knowledge in Big Data engineering processes and tools (e.g. Hadoop, Spark, Scala, Python), conceptual & hands-on
Basic hands-on experience with Spark
Good knowledge of and hands-on experience with programming languages such as Scala/Python/Java
Experience with Relational and NoSQL databases
Working experience with Cloudera/MapR/Hortonworks
Job-scheduling with Oozie
Hands-on experience with Linux/Unix scripting
Understanding of REST interfaces
Openness to learning new technologies
Job Description for Big Data Developer positions:
Mandatory Skills:
Apache Spark
Scala
Python
Java 8
Apache Hadoop
HDFS
Shell / Linux basics
REST
The following knowledge is an added benefit:
Apache Airflow
MapR
Openshift
Jersey
Jetty