
senior data software engineer

Data Software Engineering, Python, Amazon Web Services, Databricks, Apache Airflow, Apache Spark, CI/CD, SQL, REST API

Are you an experienced Data Software Engineer looking for a new challenge? Join our team as a Senior Data Software Engineer and play a pivotal role in supporting Data Science teams, building efficient data solutions, and contributing to cutting-edge analytics projects. Proficiency in Python and AWS is key to success in this role.

responsibilities
  • Collaborate closely with Data Science teams, providing essential support and mentorship as they work on analytical projects
  • Take ownership of building and maintaining tables, resolving incoming support tickets, and providing guidance while keeping back-and-forth on tickets to a minimum
  • Apply your expertise to build datamarts and dedicated data models tailored to specific analytics needs
  • Engage in sprint-based development, ensuring timely delivery of data solutions aligned with Data Scientists' requirements
  • Handle ad-hoc data requests and provide on-call support as necessary
  • Contribute to the development of data ingestion pipelines, enhancing the efficiency of data processing workflows (a minimal Airflow sketch follows this list)
  • Publish comprehensive documentation to aid users in effectively utilizing the data solutions you provide
  • Collaborate effectively within the team and demonstrate exceptional communication and documentation skills
  • Embrace constructive feedback and exhibit a proactive attitude towards continuous learning and adapting to new technologies
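
As a rough illustration of the ingestion pipeline work described above, here is a minimal Airflow DAG sketch; it assumes Airflow 2.4+ and uses hypothetical extract and load steps.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_ingestion():
    @task
    def extract() -> list[dict]:
        # Placeholder: pull new records from a source system (e.g. a REST API).
        return [{"id": 1, "value": 42}]

    @task
    def load(records: list[dict]) -> None:
        # Placeholder: write the records into the target table / datamart.
        print(f"loading {len(records)} records")

    load(extract())


daily_ingestion()
```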
requirements
  • A minimum of 3 years of relevant experience in Data Software Engineering, showcasing a strong track record of successful project delivery
  • Proficiency in Python, enabling you to develop robust and efficient data solutions
  • In-depth knowledge of Amazon Web Services, utilizing cloud resources to optimize data processing and storage
  • Experience with Apache Airflow, contributing to streamlined workflow orchestration
  • Familiarity with Apache Spark, enhancing your ability to work with large-scale data processing (a short PySpark ETL sketch follows this list)
  • Solid understanding of ETL processes, ensuring efficient data movement and transformation
  • Proficiency in CI/CD practices, facilitating smooth and automated software delivery
  • Strong command of SQL, enabling you to manipulate and query complex datasets
  • Experience working with REST APIs, facilitating seamless data integration with other systems
  • Fluent English communication skills at a B2+ level, ensuring effective collaboration within an international team
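
The Spark, ETL, and SQL requirements above typically come together in jobs like the following minimal PySpark sketch, which reads raw data, aggregates it, and writes a datamart table; the S3 paths and column names here are illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_revenue_datamart").getOrCreate()

# Extract: read raw order events from the lake (path is hypothetical).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: aggregate into an analytics-friendly daily shape.
daily_revenue = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write the datamart table, partitioned for downstream queries.
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/datamarts/daily_revenue/")
)
```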
nice to have
  • Familiarity with Databricks and Spark, further enhancing your ability to work with big data
  • Experience with Redshift, contributing to optimized data warehousing solutions

benefits for locations

Poland
For you
  • Discounts on health insurance, sports clubs, shopping centers, cinema tickets, etc.
  • Stable income
  • Flexible roles
For your comfortable work
  • 100% remote work forever
  • EPAM hardware
  • EPAM software licenses
  • Access to offices and co-workings
  • Stable workload
  • Relocation opportunities
  • Flexible engagement models
For your growth
  • Free training for technical and soft skills
  • Free access to LinkedIn Learning platform
  • Language courses
  • Free access to internal and external e-Libraries
  • Certification opportunities
  • Skill advisory service