Hadoop Developer

IT Solutions Tokyo, Japan


Position at Capgemini Japan

Duties and Responsibilities

  • Provide engineering support for upcoming projects and operations
  • Design, develop, and maintain Scala/Spark applications on the Hadoop ecosystem
  • Support database performance tuning in consultation with the client
  • Support engineering security inspections
  • Perform basic Hadoop administration and job monitoring

Requirements and Qualifications

  • Sound understanding of the Hadoop ecosystem
  • Hands-on experience with Hadoop components: HDFS, Hive, HBase, MapReduce, Oozie, Phoenix, Solr
  • At least 3 years of experience in Scala, with strong coding expertise in Scala and Spark 2.0
  • Experience with data transformation using Spark, including streaming
  • Good understanding of database concepts and SQL
  • Experience with RDBMS and database performance tuning for Oracle and SQL Server
  • Experience with Unix/Linux shell scripting, Python, Java, and cloud computing
  • Good knowledge of Git and sbt
  • Experience with IDEs such as Eclipse or IntelliJ


Business-level Japanese OR business-level English



For more information about Capgemini, please visit www.capgemini.com/jp-jp/.
Get the Future You Want