Cloudera

Solutions Consultant (BB-952CF)

Found in: Neuvoo CN

Job Description:

Cloudera is seeking an experienced Solutions Consultant to join our team. This key role has two major responsibilities: first, to work directly with our customers and partners to optimize their plans and objectives for architecting, designing, and deploying Apache Hadoop environments; and second, to help build and design reference configurations that enable our customers and influence our product. The Solutions Consultant will facilitate the flow of communication between Cloudera teams and the customer. For these strategically important roles, we are seeking outstanding talent to join our team.

Responsibilities:

Work directly with customer business and technical teams to understand requirements and develop high quality solutions

Design highly scalable and reliable data pipelines to consume, integrate, and analyze large amounts of data from various sources.

Understand big data use cases and recommend standard design and implementation patterns used in Hadoop-based deployments

Document and present complex architectures to the customer's technical teams

Work closely with Cloudera teams at all levels to ensure project and customer success

Design effective data models for optimal storage and retrieval, and implement comprehensive data quality checks to ensure high-quality data

Design, build, tune, and maintain data pipelines using Hadoop, NiFi, or related data integration technologies

Install, deploy, augment, upgrade, manage and operate large Hadoop clusters

Produce technical documentation, customer status reports, and knowledge-base articles

Keep current with the Hadoop and NiFi ecosystems and related big data technologies

Qualifications:

8+ years of overall IT experience, including at least 4 years of production experience in data engineering with Hadoop and/or NiFi.

Hands-on experience with all aspects of developing, testing and implementing low-latency big data pipelines.

Demonstrated production experience in data engineering, data management, cluster management and/or analytics domains.

Experience designing queries against data stored in HDFS using tools such as Apache Hive

Experience implementing MapReduce and Spark jobs

Experience setting up multi-node Hadoop clusters

Experience in systems administration or DevOps with one or more open-source operating systems (big data developers interested in administration and consulting may also apply)

Experience with data warehouse design, ETL (extract, transform, load), and architecting efficient software designs for data warehouse platforms.

Experience implementing operational best practices such as alerting, monitoring, and metadata management.

Strong understanding of enterprise security practices and solutions such as LDAP and/or Kerberos

Experience using configuration management tools such as Ansible, Puppet or Chef

Familiarity with scripting languages such as bash, Python, and/or Perl

Experience with Apache NiFi is desired

Significant prior experience writing to network-based APIs, preferably REST/JSON or SOAP/XML

Understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (e.g. jstack, jmap, jconsole), logging and monitoring tools (log4j, JMX)

Ability to understand and translate customer requirements into technical requirements

Excellent verbal and written communications

Nice to have, but not required:

Site Reliability Engineering concepts and practices

Knowledge of the data management ecosystem, including data warehousing, ETL, and data integration concepts

Experience using a compiled programming language, preferably one that runs on the JVM (e.g., Java, Scala)

Experience coding with streaming/micro-batch compute frameworks, preferably Kafka and/or Spark

  • Posted today
  • Full time
  • Job ID: 200648
  • Location: Beijing, China