⚠︎ Archived vacancy
This vacancy has been moved to the archive. It may no longer be open, and the recruiter may no longer be accepting applications for it.

Middle+/Senior Data Engineer (Python)

Direct employer: WaveAccess (waveaccess.ru)
Tbilisi, Georgia
Middle • Senior
Analytics, Data Science, Big Data • Python • Engineer • SQL • Custom software development
May 28
Remote work
More than 5 years of experience
Employer: WaveAccess
Job description

We create the software of the future. Join us!

We are looking for an experienced data engineer who is proficient in Python and has extensive experience with cloud services, preferably AWS. You will be responsible for designing, implementing, and supporting data processing and storage systems, ensuring high quality and availability of data for business analytics and data analysis applications.

Tech Stack: Python, SQL, Terraform, AWS Cloud

Required Skills and Experience:

  • Bachelor's degree in computer science, engineering, or related disciplines
  • Fluent English
  • Minimum of 5 years of experience as a data engineer or in a similar role
  • Proficiency in Python for data processing and pipeline development
  • Significant experience working with AWS Cloud services related to data storage, processing, and analytics (e.g., S3, Glue, Lambda, Redshift, DynamoDB)
  • Experience with SQL databases, including design, queries, and optimization
  • Familiarity with data modeling, ETL processes, and data warehouse concepts
  • Knowledge of data integration tools and frameworks (e.g., Apache Airflow)
  • Excellent problem-solving skills and the ability to work independently and as part of a team
  • Strong communication skills and the ability to explain technical concepts to non-technical audiences

Desirable Skills:

  • Experience with AWS Glue
  • Minimum of 3 years of professional experience in data engineering or a related field, with specific experience in Snowflake Data Warehouse and data workflows
  • Experience with TypeScript
  • Proficiency in Apache Airflow
  • Familiarity with Linux (including bash scripting) and SSH

Nice to Have:

  • Experience with Apache Spark and Scala
  • Familiarity with Snowflake Data Warehouse
  • Experience working with Data Science teams

Responsibilities:

  • Designing, creating, and implementing highly scalable database tables in the Snowflake Data Warehouse to support data and business analytics
  • Transforming business requirements into data models that are easily understood and usable by various specialties within the company
  • Supporting business requirements through data architecture
  • Collaborating with other technical team members to establish best practices and data management standards
  • Performance tuning of data, troubleshooting, and optimization
  • Ensuring data privacy and security compliance

We offer:

  • Varied projects for international clients on a modern technology stack
  • Friendly team and enjoyable working environment
  • Regular assessments and salary reviews 

Specialization
Analytics, Data Science, Big Data • Python • Engineer • SQL
Industry and application area
Custom software development
Position level
Middle • Senior