AWS Data Engineer Job in Bangalore – Build Scalable Cloud Data Pipelines with Python & Spark at Cognizant
AWS Data Engineer Opportunity in Bangalore
Overview
We are seeking a results-driven AWS Data Engineer to collaborate on high-impact projects from our Bangalore office. This role is ideal for professionals with 4 to 8 years of hands-on data engineering experience who are passionate about building advanced Lakehouse platforms in the AWS ecosystem. The position offers a hybrid working model, enabling flexibility, collaboration, and innovation in equal measure.
In this role, you will play a pivotal part in designing and developing high-performance, scalable data workflows using Python and Apache Spark. You will be responsible for building robust, enterprise-grade data pipelines that form the backbone of our AWS Lakehouse infrastructure. The ideal candidate will have solid expertise in AWS services, strong Python and Spark skills, and a deep understanding of ETL best practices.
Key Responsibilities
- Design & Implementation: Architect and implement scalable data pipelines that support the ingestion, transformation, and storage of large volumes of structured and semi-structured data using AWS Glue, Lambda, and EMR.
- Data Lakehouse Development: Collaborate with architects and development leads to build Lakehouse solutions that combine the best features of data lakes and data warehouses for improved data governance and analytics.
- Workflow Optimization: Utilize Python and Spark to develop ETL jobs that are optimized for performance, reliability, and maintainability; a minimal illustrative sketch follows this list. Ensure that pipelines run efficiently and can handle production-level data loads.
- Cross-Functional Collaboration: Engage with product owners, stakeholders, and fellow engineers to translate business needs into technical requirements. Decompose complex solutions into manageable Epics and user stories.
- Technical Documentation: Prepare detailed technical documents and system designs to facilitate team communication and ensure compliance with internal and external audit requirements.
- Code Quality & Testing: Uphold best practices in software engineering through code reviews, unit testing, and continuous integration strategies.
- Innovation & Learning: Participate in technical discussions, bring innovative ideas to the table, and stay current with emerging trends in cloud-based data engineering.
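For a concrete sense of the work, here is a minimal, hedged PySpark sketch of the kind of ETL job described in the responsibilities above. The bucket names, paths, and column names are illustrative assumptions, not details of the actual platform.

```python
# Minimal illustrative PySpark ETL sketch (assumed names/paths, not the real pipeline).
# Assumes a Spark runtime with S3 access, e.g. an EMR cluster or a Glue Spark job.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-etl-example")  # hypothetical job name
    .getOrCreate()
)

# Ingest semi-structured JSON landed by an upstream process (hypothetical path).
raw = spark.read.json("s3://example-raw-bucket/orders/2024/")

# Basic cleansing and typing: drop rows missing keys, normalize timestamps,
# and derive a date column used for partitioning downstream.
cleaned = (
    raw.dropna(subset=["order_id", "order_ts"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
)

# Write to the curated zone as date-partitioned Parquet, a common Lakehouse layout
# that keeps Athena and Spark queries scan-efficient.
(
    cleaned.write
           .mode("overwrite")
           .partitionBy("order_date")
           .parquet("s3://example-curated-bucket/orders/")
)

spark.stop()
```

Partitioning the curated output by date is one common way to keep query costs predictable as volumes grow; the real design would be driven by the platform's actual access patterns.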
Core Technical Expertise Required
- Extensive hands-on experience as a Data Engineer, specializing in designing and implementing solutions using cloud-native architecture frameworks
- Strong command of Python and Apache Spark for data processing, transformation, and automation
- Proficient in designing and implementing ETL pipelines using AWS tools and services such as the following (an orchestration sketch follows this list):
  - AWS Glue
  - AWS EMR (Elastic MapReduce)
  - AWS Lambda
  - AWS Step Functions
  - Amazon Athena
  - API Gateway
- Familiarity with IAM, RDS, SQS, and other AWS components to support secure, scalable data solutions
- Solid understanding of Agile methodologies, including sprint planning, retrospectives, and story decomposition
- Strong background in data validation, automated testing, and quality assurance practices
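As a hedged illustration of the serverless pattern these services enable (for example, Lambda kicking off a Glue job when new data lands, often coordinated by Step Functions), the sketch below uses boto3. The job name, environment variable, and S3 event shape are assumptions made for the example only.

```python
# Illustrative AWS Lambda handler sketch: starts a Glue ETL job run on an S3 event.
# The Glue job name and the "--source_key" argument are hypothetical.
import json
import os

import boto3

glue = boto3.client("glue")


def lambda_handler(event, context):
    """Start a Glue job run when new data arrives (e.g., via an S3 event notification)."""
    job_name = os.environ.get("GLUE_JOB_NAME", "orders-etl-example")

    # Pass the triggering object key to the job as an argument, if one is present.
    records = event.get("Records", [])
    source_key = records[0]["s3"]["object"]["key"] if records else ""

    response = glue.start_job_run(
        JobName=job_name,
        Arguments={"--source_key": source_key},
    )

    return {
        "statusCode": 200,
        "body": json.dumps({"JobRunId": response["JobRunId"]}),
    }
```

In practice the same trigger could instead start a Step Functions execution that sequences Glue jobs, data-quality checks, and Athena table updates; the Lambda-to-Glue call is shown here only because it is the smallest self-contained example.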
Preferred Qualifications
- Familiarity with the Lakehouse architecture, combining elements of data lakes and data warehouses to support a wide range of analytics workloads
- Understanding of data modeling practices to ensure scalable, reliable data products that fulfill both business and technical requirements
- Experience working with CI/CD pipelines and infrastructure automation tools to enable seamless deployment processes; a small automated-test sketch follows this list
- Exposure to the Financial Services domain is a plus, especially in areas related to Equity and Fixed Income asset classes or index data processing
- Certification in cloud technologies is an added advantage:
  - AWS Certified Data Analytics
  - AWS Certified Solutions Architect
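To illustrate the automated testing and CI/CD practices mentioned above, here is a small pytest sketch that exercises the cleansing step from the earlier ETL example against a local Spark session. The test data and column names are assumptions; a real suite would cover the platform's actual schemas and rules.

```python
# Hypothetical pytest sketch: unit-tests a cleansing rule on a local Spark session.
# This is the kind of check typically run in a CI pipeline before deployment.
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="module")
def spark():
    session = (
        SparkSession.builder
        .master("local[1]")
        .appName("etl-unit-tests")
        .getOrCreate()
    )
    yield session
    session.stop()


def test_rows_without_order_id_are_dropped(spark):
    rows = [
        ("A-1", "2024-01-05T10:00:00", "12.50"),
        (None, "2024-01-05T11:00:00", "7.00"),
    ]
    df = spark.createDataFrame(rows, ["order_id", "order_ts", "amount"])

    # Mirrors the dropna(subset=["order_id", ...]) cleansing step in the ETL sketch.
    cleaned = df.dropna(subset=["order_id"])

    assert cleaned.count() == 1
    assert cleaned.first()["order_id"] == "A-1"
```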
Educational Requirements
- A Bachelor’s degree in Computer Science, Information Technology, Software Engineering, or a related field is mandatory
- A Master’s degree or relevant professional certifications will be considered a strong advantage
What Sets This Role Apart
This role offers a unique opportunity to work on large-scale data infrastructure projects that directly impact enterprise decision-making. You will be a core part of a team driving real-time analytics, automation, and insight generation using modern data engineering tools.
You will also have the opportunity to:
- Shape the direction of data strategy by contributing to architectural decisions
- Mentor junior engineers and contribute to a culture of continuous learning and excellence
- Be involved in client-facing activities to gather insights and refine solution design
Why Join Us?
We believe that people thrive when they are empowered, supported, and part of a vibrant community. As part of our global Cognizant family, you will join a diverse team of over 300,000 professionals dedicated to shaping the future of technology.
- We foster a collaborative and inclusive workplace where diverse perspectives are valued
- We are committed to innovation, sustainability, and community well-being
- Recognized among Forbes World’s Best Employers (2024) and a NASDAQ-100 company
- Opportunities for growth through learning programs, mentorship, and career development paths
Equal Employment Opportunity
Cognizant is proud to be an equal opportunity employer, and we are committed to building a diverse and inclusive workforce. Employment decisions are made regardless of race, color, religion, national origin, gender identity, sexual orientation, age, disability, veteran status, or any other protected status under applicable laws.
If you require reasonable accommodation due to a disability to participate in the application or interview process, please contact us at Ca********@*******nt.com with your request.
Important Notice
Compensation and job conditions are accurate as of the time of posting and are subject to change in accordance with company policy and legal requirements. Shortlisted applicants may be required to participate in in-person or video-based interviews and provide valid state or government-issued identification during the process.
Learn More About Cognizant
Cognizant is one of the world’s leading professional services firms, transforming client operations through digital innovation. Our consultative, industry-based approach helps organizations envision, build, and operate more modern, efficient, and agile enterprises.
To discover how we empower businesses to lead with digital, visit: www.cognizant.com
Keywords: AWS Data Engineer jobs in Bangalore, Python Spark Data Engineering, Lakehouse Architect AWS, AWS Glue Lambda Step Functions, Data Engineering career Bangalore, ETL Developer AWS, Cloud Data Engineer, Data Pipeline Developer, Data Engineer Cognizant, Spark Python AWS Jobs
Conclusion
If you’re a data engineering professional looking for your next challenge in a supportive, innovation-driven environment, this opportunity is tailor-made for you. Contribute to high-impact projects, work with cutting-edge cloud technologies, and grow alongside some of the most talented minds in the industry. Join us and help shape the future of data engineering.
Apply today and be part of a journey where your ideas, skills, and passion create real value.