Software Engineer II - PySpark, AWS

Aumni

Software Engineering
Hyderabad, Telangana, India
Posted on Oct 14, 2025

Job Information

  • Job Identification 210673790
  • Job Category Software Engineering
  • Business Unit Corporate Sector
  • Posting Date 10/14/2025, 02:24 AM
  • Locations MAGMA, Unit-1, Phase-IV, Sy. No. 83/1, Plot No. 2, Ground Floor to 2nd Floor and 5th Floor to 16th Floor, Basements 1 and 2, Hyderabad, IN-TG, 500081, IN
  • Apply Before 10/30/2025, 09:00 PM
  • Job Schedule Full time

Job Description

You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you.

As a Software Engineer II at JPMorgan Chase within Consumer and Community Banking Risk, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities

  • Executes software solutions, including design, development, and technical troubleshooting, for technology products and systems
  • Writes secure and high-quality code using PySpark with guidance from senior team members
  • Designs, develops, and troubleshoots software with consideration of upstream and downstream technical implications
  • Applies tools within the Software Development Life Cycle to improve automation and value delivery
  • Gathers, analyzes, and draws conclusions from large, diverse data sets to support secure, stable application development
  • Learns and applies system processes and methodologies for developing secure, stable code and systems
  • Adds to team culture of diversity, inclusion, and respect

Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 2+ years applied experience
  • Experience in system design, application development, testing, and operational stability
  • Proficiency with distributed computing frameworks such as Apache Spark, especially PySpark
  • Experience building data pipelines on AWS using Lambda, SQS, SNS, Athena, Glue, and EMR (see the illustrative sketch after this list)
  • Proficient in coding in PySpark
  • Experience in developing, debugging, and maintaining code in a corporate environment
  • Experience writing SQL queries and knowledge of Databricks
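
For context, the sketch below outlines the kind of PySpark data-pipeline work described above: reading data from S3, aggregating it, and writing a partitioned output that Athena or Glue could then query. It is a rough, illustrative example only; the bucket, column, and application names are assumptions, not details taken from this posting.

  # A minimal PySpark sketch (illustrative only; names are hypothetical)
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.appName("risk-pipeline-sketch").getOrCreate()

  # Read raw events from a hypothetical S3 location, e.g. as landed by an
  # upstream Glue or Lambda process
  events = spark.read.parquet("s3://example-bucket/raw/trade_events/")

  # Aggregate notional per account per day
  daily_exposure = (
      events
      .withColumn("trade_date", F.to_date("trade_ts"))
      .groupBy("account_id", "trade_date")
      .agg(F.sum("notional").alias("total_notional"))
  )

  # Write a partitioned output that Athena can query
  daily_exposure.write.mode("overwrite").partitionBy("trade_date").parquet(
      "s3://example-bucket/curated/daily_exposure/"
  )

In practice, a job like this would typically run on EMR or as a Glue job, with Lambda, SQS, or SNS coordinating triggers, which is the kind of pipeline the qualifications above describe.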

Preferred qualifications, capabilities, and skills

  • Exposure to cloud technologies
  • Familiarity with the financial services industry