About the job
Our company is looking for a Data Engineer to join a global industrial project: someone with a passion for driving the design, execution, and ongoing support of data services that enable large-scale data collection, near real-time and offline analytics, and distributed search, applying AI and ML for security, engineering, and business intelligence purposes.
Our client is the biggest industrial company focused on building materials. They are leaders in delivering product excellence, exceptional customer experiences, and market innovation.
Technology stack:
GCP;
Big Data;
Spark or MapReduce;
SQL / BigQuery / Pub/Sub;
BigQuery, Cloud Composer, Data Fusion, GCS, and GKE;
Python or JavaScript;
Terraform scripting.
About the role:
As a Data Engineer, you will work with one of the largest B2B data sets in the world using cutting-edge technologies. You will have the opportunity to build data pipelines, analyze data structures, connect sources to the data lake, and work with geodata.
Responsibilities:
• Maintain and support existing ingestion pipelines;
• Optimize queries based on performance testing;
• Design and implement new ingestion pipelines that bring data from external data sources (HTTPS, SFTP) or internal data sources (JDBC, HTTP, MQTT);
• Work with application engineers and product managers on refining data requirements;
• Implement and test fine-grained access control (per dataset, per column).
Required skills:
• Strong experience in building cloud native data engineering solutions using GCP or AWS platforms;
• 2+ years of development experience with Big Data;
• Prior experience building data ingestion pipelines for telemetry data in GCP/AWS, including app monitoring, performance monitoring, and network monitoring logs;
• Background in building data integration applications using Spark or MapReduce frameworks;
• Track record of producing software artifacts of exceptional quality by adhering to coding standards, design patterns, and best practices;
• Strong background in SQL / BigQuery / Pub/Sub;
• Experience with GCP products such as BigQuery, Cloud Composer, Data Fusion, GCS, and GKE, or corresponding technologies on the AWS platform;
• High proficiency with Git, automated builds, and CI/CD pipelines;
• ETL scripting in Python or JavaScript;
• Some knowledge of Terraform scripting (adding new datasets, buckets, IAM);
• Intermediate level of English or higher.
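To give a sense of the Terraform scripting mentioned in the requirements, here is a minimal sketch of provisioning a BigQuery dataset, a GCS bucket, and a dataset-level IAM binding with the Google Cloud provider. All resource names, the bucket name, and the group address are hypothetical placeholders, not part of the project's actual configuration.

```hcl
# Hypothetical example: a BigQuery dataset, a GCS landing bucket,
# and read access for an analyst group. Names are placeholders.
resource "google_bigquery_dataset" "telemetry" {
  dataset_id = "telemetry_raw"
  location   = "EU"
}

resource "google_storage_bucket" "landing" {
  name     = "example-landing-bucket"
  location = "EU"
}

resource "google_bigquery_dataset_iam_member" "analyst_read" {
  dataset_id = google_bigquery_dataset.telemetry.dataset_id
  role       = "roles/bigquery.dataViewer"
  member     = "group:analysts@example.com"
}
```

Day-to-day Terraform work on a project like this would typically consist of small, reviewable changes of this kind.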
Our benefits:
• Strong opportunities for professional and career growth: meetups, TechClubs, a professional library, and more;
• Challenging tasks with a friendly experienced team;
• Flat hierarchy without micromanagement — our doors are open, and all teammates are approachable;
• Direct communication with stakeholders and the ability to influence product development;
• Compensation of up to 50% of the cost of educational courses and conferences for professional growth;
• Free English classes and a Business English course;
• 23 business days of leave and medical COVID support;
• Legal and accounting services;
• Regular team events and activities;
• Gifts for significant life events.
About the Company:
Techstack is a full-stack development company that works on products that matter and creates craft solutions for businesses and people.
We are not solo players but a band with common goals: to share our expertise and build better things.
We are now 100+ highly skilled specialists, and we keep growing as a team and as professionals.