SoftServe — No experience required · English: not required · Lviv, Kyiv, Kharkiv, Dnipro
  • DBA
  • Data Science
04.01.22

About the job

WE ARE

We are successfully cooperating with our client, one of the world's leading providers of enterprise data cloud solutions. Currently, we are looking for a strong Big Data expert to join the APAC Professional Services team. In this role, you'll have the opportunity to develop massively scalable solutions to complex data problems using Hadoop, NiFi, Spark, and related Big Data technologies. This is a client-facing role that combines consulting skills with deep technical design and development in the Big Data space.

YOU ARE

  • Experienced in Information Technology and System Architecture
  • Possessing Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
  • Experienced in designing and deploying 3-tier architectures or large-scale Hadoop solutions
  • Able to understand Big Data use cases and recommend the standard design patterns commonly used in Hadoop-based and streaming data deployments
  • Possessing knowledge of the data management ecosystem, including concepts of data warehousing, ETL, data integration, etc.
  • Skillfully understanding and translating customer requirements into technical requirements
  • Accustomed to implementing data transformation and processing solutions
  • Confident in designing data queries against data in the HDFS environment using tools such as Apache Hive
  • Accustomed to setting up multi-node Hadoop clusters
  • Skilled in configuring security (LDAP/AD, Kerberos/SPNEGO)
  • Able to implement software and/or solutions in the enterprise Linux environment
  • Having a deep understanding of enterprise security solutions such as LDAP and/or Kerberos
  • Possessing a strong understanding of network configuration, devices, protocols, speeds, and optimizations
  • Showing a solid understanding of the Java ecosystem including debugging, logging, monitoring, and profiling tools
  • Familiar with scripting tools such as bash shell scripts, Python and/or Perl, Ansible, Chef, Puppet
  • Having a solid background in Database administration or design
  • Experienced in architecting data center solutions — properly selecting server and storage hardware based on performance, availability, and ROI requirements
  • Excellent at verbal and written communications
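
In practice, the security bullets above (Kerberos/SPNEGO, LDAP/AD, secured multi-node clusters) come down to cluster configuration. A minimal, hypothetical core-site.xml fragment for enabling Kerberos authentication on a Hadoop cluster might look like this; the property names are standard Hadoop, while the EXAMPLE.COM realm is a placeholder:

```
<!-- Hypothetical core-site.xml fragment: enable Kerberos auth on a
     Hadoop cluster. The EXAMPLE.COM realm is a placeholder. -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
<property>
  <!-- Map Kerberos principals to local OS user names -->
  <name>hadoop.security.auth_to_local</name>
  <value>
    RULE:[2:$1@$0](.*@EXAMPLE\.COM)s/@.*//
    DEFAULT
  </value>
</property>
```

A real secured deployment would additionally need per-service keytabs and principal settings (for HDFS, YARN, etc.), which are omitted here.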

YOU WANT TO WORK WITH

  • Designing and implementing Hadoop and NiFi platform architectures and configurations for customers
  • Performing platform installation and upgrades for advanced secured cluster configurations
  • Analyzing complex distributed production deployments, and making recommendations to optimize performance
  • Documenting and presenting complex architectures to the customer's technical teams
  • Writing and producing technical documentation, blogs, and knowledgebase articles
  • Keeping current with the Hadoop Big Data ecosystem technologies
  • Attending speaking engagements when needed
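
To give a flavor of the production-analysis work above, day-to-day triage often starts with quick scripting against service logs. A minimal sketch (the log lines are fabricated for illustration):

```shell
#!/bin/sh
# Hypothetical example: count WARN/ERROR entries in a Hadoop service log
# as a first step of performance/health triage. Sample data is fabricated.
log=$(mktemp)
cat > "$log" <<'EOF'
2022-01-04 10:00:01 INFO  NameNode: startup complete
2022-01-04 10:00:05 WARN  DataNode: slow BlockReceiver write
2022-01-04 10:00:09 ERROR DataNode: disk failure on /data/3
2022-01-04 10:00:12 WARN  DataNode: slow BlockReceiver write
EOF
warns=$(grep -c ' WARN ' "$log")
errors=$(grep -c ' ERROR ' "$log")
echo "WARN=$warns ERROR=$errors"
rm -f "$log"
```

Running it prints `WARN=2 ERROR=1` for the sample log; in practice the same pattern is applied to real logs pulled from cluster nodes.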

TOGETHER WE WILL

  • Work directly with customers to implement Big Data solutions at scale
  • Participate in the pre- and post-sales process by helping both the sales and product teams interpret customers' requirements

