Sr. Hadoop Developer
Westerville, OH
Contract
Job Description:
  • The Big Data Lead Developer is responsible for the design and development of the core platform that enables the delivery and construction processes for the Data Management, Data Discovery and Analytics group, leveraging emerging big data technologies.
  • The individual is a subject-matter-expert technologist with strong Java experience and deep knowledge of the utilization and integration of Open Source software.
  • The individual has a deep understanding of enterprise software design and applies it to the implementation of data services and middleware.
  • This is a "been there, done that" technologist who thrives on driving efforts to completion while utilizing best-of-breed technologies and methodologies.
  • The individual should also function as a Solution Architect, and must be both visionary and execution-driven.
  • The individual must have successful experience in Big Data implementations for large data integration initiatives.
  • Day-to-day activities will vary widely based on the organization's priorities and needs at any point in time.
  • As such, this individual must be comfortable with flexibility in their role.
  • They must be able to operate in a relaxed yet confident manner, without explicit hierarchy or structure governing their work.
  • An affinity for, and appreciation of, an influence-based and entrepreneurial culture is critical for success.

Key Responsibilities include:
  • Component Software Design & Development.
  • Ensuring excellent practices are utilized in delivering Big Data Management and Integration Solutions.
  • Ensuring design decisions can be actioned by the development team.
  • Participating in agile development projects.
  • Acting as a role model for all best practices, ensuring consistency across entire team.
  • Mentoring technical development team on optimal utilization of Big Data solutions and Apache Open Source Software. Helping build a great team.
  • Leveraging new and emerging practices for Enterprise Data Architecture.
  • Engaging in enterprise-level systems component design and implementation.
  • Performing systems integration, including design and development of APIs, adapters, and connectors.
  • Integrating with Hadoop/HDFS, real-time systems, data warehouses, and analytics solutions (a minimal HDFS integration sketch follows this list).
  • Writing and maintaining reference architectures and systems-design best-practice guidelines.
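
As one small, concrete illustration of the integration work described above, the sketch below writes and reads a file on HDFS through the standard Hadoop FileSystem API. The namenode address and paths are placeholder assumptions, not details from this posting.

```java
import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsRoundTrip {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder namenode address; in practice this comes from core-site.xml.
        conf.set("fs.defaultFS", "hdfs://namenode:8020");

        try (FileSystem fs = FileSystem.get(conf)) {
            Path path = new Path("/tmp/example/hello.txt");

            // Write a small file, overwriting any existing copy.
            try (FSDataOutputStream out = fs.create(path, true)) {
                out.write("hello, hdfs".getBytes(StandardCharsets.UTF_8));
            }

            // Read it back and print the contents.
            try (FSDataInputStream in = fs.open(path)) {
                byte[] buf = new byte[(int) fs.getFileStatus(path).getLen()];
                in.readFully(buf);
                System.out.println(new String(buf, StandardCharsets.UTF_8));
            }
        }
    }
}
```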
Required Skills:
  • Excellent analytical, communication, organizational and problem-solving skills coupled with a strong work ethic
  • Ability to translate business requirements into functional requirements documentation.
  • 9+ years' experience with the full development lifecycle, from inception through implementation, leveraging Java and various Java frameworks
  • 4+ years architecting and implementing applications leveraging common patterns such as SEDA, Lambda, Kappa and similar data processing architectures.
  • 4+ years implementing Big Data technologies including Spark, HDFS, MapReduce, Hive, Sqoop, and similar technologies (see the Spark/Hive sketch after this list).
  • 4+ years leveraging big data consumption tools such as Impala, Hive, Drill, or similar query engines.
  • 4+ years’ experience with Scala and similar Big Data oriented languages
  • Experience with development, deployment, and support of large-scale distributed applications in a mission-critical production environment.
  • Test-infected attitude (strong desire to perform thorough and exhaustive unit, integration and system testing).
  • Preparing test plans and performing system testing.
  • Experience with TDD utilizing test data, JUnit, and Mockito (a JUnit/Mockito sketch follows this list).
  • Experience with JSON, XML, XSD, and JAXB (a JAXB round-trip sketch follows this list).
  • Experience with Change Management and Incident Management processes.
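
To give a concrete flavor of the Spark/Hive requirement above, here is a minimal sketch, assuming a hypothetical `sales` Hive table and a cluster where Hive support is enabled, that runs an aggregate query through SparkSession:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SalesByRegion {
    public static void main(String[] args) {
        // Hive support lets spark.sql() resolve tables in the Hive metastore.
        SparkSession spark = SparkSession.builder()
                .appName("SalesByRegion")
                .enableHiveSupport()
                .getOrCreate();

        // The `sales` table and its columns are hypothetical placeholders.
        Dataset<Row> totals = spark.sql(
                "SELECT region, SUM(amount) AS total "
                + "FROM sales GROUP BY region ORDER BY total DESC");

        totals.show(20, false);
        spark.stop();
    }
}
```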
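Similarly, a minimal sketch of the TDD style the JUnit/Mockito bullet describes; the `BalanceRepository` interface and `AccountService` class are hypothetical, invented only to demonstrate a mocked dependency under test:

```java
import static org.junit.Assert.assertTrue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class AccountServiceTest {

    // Hypothetical dependency to be mocked.
    interface BalanceRepository {
        long balanceInCents(String accountId);
    }

    // Hypothetical class under test.
    static class AccountService {
        private final BalanceRepository repo;
        AccountService(BalanceRepository repo) { this.repo = repo; }
        boolean canWithdraw(String accountId, long amountInCents) {
            return repo.balanceInCents(accountId) >= amountInCents;
        }
    }

    @Test
    public void allowsWithdrawalWhenBalanceIsSufficient() {
        // Stub the repository so the test controls the balance.
        BalanceRepository repo = mock(BalanceRepository.class);
        when(repo.balanceInCents("acct-1")).thenReturn(500L);

        AccountService service = new AccountService(repo);

        assertTrue(service.canWithdraw("acct-1", 200L));
        verify(repo).balanceInCents("acct-1");
    }
}
```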
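Finally, a minimal JAXB round-trip sketch for the XML/XSD bullet; the `Customer` class is a hypothetical example, and the classic `javax.xml.bind` namespace assumes Java 8-era JAXB:

```java
import java.io.StringReader;
import java.io.StringWriter;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.Unmarshaller;
import javax.xml.bind.annotation.XmlRootElement;

public class JaxbRoundTrip {

    // Hypothetical bound class; JAXB requires a no-arg constructor.
    @XmlRootElement
    public static class Customer {
        public String name;
        public int id;
    }

    public static void main(String[] args) throws Exception {
        JAXBContext ctx = JAXBContext.newInstance(Customer.class);

        Customer c = new Customer();
        c.name = "Acme";
        c.id = 42;

        // Object -> XML
        StringWriter xml = new StringWriter();
        Marshaller m = ctx.createMarshaller();
        m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
        m.marshal(c, xml);
        System.out.println(xml);

        // XML -> object
        Unmarshaller u = ctx.createUnmarshaller();
        Customer back = (Customer) u.unmarshal(new StringReader(xml.toString()));
        System.out.println(back.name + " / " + back.id);
    }
}
```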

This is a corp-to-corp opportunity. Please share your updated resume ASAP. Thank you.
