5+ years of relevant professional experience
Experience with the Hadoop (or similar) ecosystem (MapReduce, YARN, HDFS, Hive, Spark, Presto, Pig, HBase, Parquet)
Proficient in at least one SQL dialect (MySQL, PostgreSQL, SQL Server, Oracle)
Good understanding of SQL engines and the ability to conduct advanced performance tuning
Strong skills in a scripting language (Python, Ruby, Bash)
1+ years of experience with workflow management tools (Airflow, Oozie, Azkaban, UC4)