Big Data & GCP- Senior Software Engineer
Since 1993, EPAM Systems, Inc. (NYSE: EPAM) has leveraged its advanced software engineering heritage to become the foremost global digital transformation services provider – leading the industry in digital and physical product development and digital platform engineering services. Through its innovative strategy; integrated advisory, consulting and design capabilities; and unique ‘Engineering DNA,’ EPAM’s globally deployed hybrid teams help make the future real for clients and communities around the world by powering better enterprise, education and health platforms that connect people, optimize experiences, and improve people’s lives. Selected by Newsweek as a 2021 Most Loved Workplace.
EPAM’s global multi-disciplinary teams of 61,300 employees serve customers in more than 50 countries across five continents.
As a recognized leader, EPAM is listed among the top 15 companies in Information Technology Services on the Fortune 1000 and ranked as the top IT services company on Fortune’s 100 Fastest-Growing Companies list for the last three consecutive years.
EPAM is also listed among Ad Age’s top 25 World’s Largest Agency Companies and in 2020, Consulting Magazine named EPAM Continuum a top 20 Fastest-Growing organization.
Responsibilities
- Design, develop and maintain the data architecture, data models and standards for various Data Integration & Data Warehousing projects on GCP, combined with other technologies
- Ensure the use of BigQuery SQL, Java/Python/Scala and Spark reduces lead time to delivery and aligns with the overall group strategic direction so that cross-functional development is usable
- Own technical solutions from a design and architecture perspective, ensuring the right direction and proposing resolutions to potential data pipeline problems
- Expand and optimize our data and data pipeline architecture, as well as data flow and collection for cross-functional teams
- Provide technical guidance and support to a vibrant engineering team, coaching teammates on how to do great data engineering
You are
- Equipped with a deep understanding of data architecture principles and data warehouse methodologies, specifically Kimball or Data Vault
- An expert in GCP, with 5-7 years of delivery experience with Dataproc, Dataflow, BigQuery, Compute, Pub/Sub and Cloud Storage
- Highly knowledgeable in industry best practices for ETL design, principles and concepts
- Equipped with at least 3 years of programming experience in Python
- A DevOps and Agile engineering practitioner with experience in test-driven development
- Experienced in the following technologies: Google Cloud Platform, Dataproc, Dataflow, Spark SQL, BigQuery SQL, PySpark and Python/Scala
- Experienced in Big Data technologies such as Spark, Hadoop and Kafka
We offer
- Insurance Coverage
- Paid Leaves – including maternity, bereavement, paternity and special COVID-19 leaves
- Financial assistance for medical crisis
- Retiral Benefits – VPF and NPS
- Customized Mindfulness and Wellness programs
- EPAM Hobby Clubs
- Hybrid Work Model
- Soft loans to set up workspace at home
- Stable workload
- Relocation opportunities with ‘EPAM without Borders’ program
- Certification trainings for technical and soft skills
- Access to unlimited LinkedIn Learning platform
- Access to internal learning programs set up by world class trainers
- Community networking and idea creation platforms
- Mentorship programs
- Self-driven career progression tool
Send us your CV to get a personalized offer.