Engineering-L2-Bengaluru-Vice President-Software Engineering
Bengaluru, Karnataka, India

About the role:

We are seeking a Data Engineer to design and implement low-latency data pipelines for transferring data across regions with unique networking challenges. The role involves leveraging cloud services, integrating on-premises infrastructure, and ensuring robust data governance through strong auditing, control frameworks, and data filtering. Experience with data cataloging and CBDT-compliant data transfer mechanisms is essential. You will work with both batch and streaming pipelines, focusing on low latency and high data quality.

Key Responsibilities

  • Hybrid Quantum-Classical Architecture: Architect scalable, low-latency backend systems that integrate GS's internal "Big Data" environments with quantum processors and simulators (e.g., AWS Braket, Azure Quantum).
  • Data Modeling with Legend: Utilize and extend Legend (GS's open-source data modeling platform) to create logical and physical data models that support high-dimensional financial datasets for quantum state preparation.
  • Quantum Data Pipeline Engineering: Design robust pipelines to handle the efficient loading of classical data into quantum-ready formats, specifically focusing on block-encodings and Quantum Random Access Memory (QRAM) architectures to minimize circuit depth.
  • Platform Scalability: Develop the "Quantum-as-a-Service" internal platform, focusing on API design, microservices, and cloud-native integration (AWS/Private Cloud) to allow GS Quants to execute quantum algorithms seamlessly.
  • Algorithm Optimization: Collaborate with Quantum Researchers to optimize data retrieval and storage, reducing the "data loading bottleneck" that currently limits quantum speedup in financial modeling.
  • Governance & Security: Ensure all quantum-related data processes comply with GS's rigorous security standards and global financial regulations, implementing enterprise-grade data mesh and zero-ETL patterns.
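The "data loading bottleneck" referenced above can be made concrete with a toy sketch of amplitude encoding, the most common way of mapping classical data to quantum states. This is an illustrative NumPy example only (the function name and toy data are ours, not part of any GS system): N classical values fit into ceil(log2 N) qubits, but preparing an arbitrary state generally costs O(N) gates, which is exactly what QRAM and block-encoding schemes try to sidestep.

```python
import numpy as np

def amplitude_encode(values):
    """Normalize a classical vector into quantum-state amplitudes.

    Amplitude encoding packs N values into ceil(log2 N) qubits, but the
    circuit preparing an arbitrary state generally needs O(N) gates --
    the 'data loading bottleneck' QRAM/block-encodings aim to reduce.
    """
    v = np.asarray(values, dtype=float)
    n_qubits = int(np.ceil(np.log2(len(v))))
    # Pad to the next power of two so the vector fills a qubit register.
    padded = np.zeros(2 ** n_qubits)
    padded[: len(v)] = v
    # Valid amplitudes must have unit L2 norm: sum |a_i|^2 == 1.
    amplitudes = padded / np.linalg.norm(padded)
    return n_qubits, amplitudes

n, amps = amplitude_encode([3.0, 4.0])
print(n, amps)  # 1 qubit; amplitudes [0.6, 0.8]
```

The same normalized vector could then be handed to a state-preparation routine in an SDK such as Qiskit or Braket; the point of the sketch is only the qubit-count and normalization arithmetic.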

Required Skills & Qualifications

  • Backend Expertise: Proficiency in Java, Python, or C++ with a focus on building distributed systems and high-performance microservices.
  • Data Engineering: Extensive experience with big data technologies (e.g., Spark, Kafka, Hadoop) and database design (SQL, NoSQL, and Graph databases).
  • Cloud Infrastructure: Strong knowledge of cloud-native architecture, specifically AWS (Lambda, S3, EMR) and containerization (Docker, Kubernetes).
  • Architectural Patterns: Deep understanding of distributed system design, data mesh, and the Legend modeling language.
  • Quantum Awareness: Conceptual understanding of quantum computing (qubits, gates, circuit depth) and the specific challenges of mapping classical data to quantum states (e.g., state preparation).
  • Education: Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.

Preferred Qualifications

  • Experience with financial modeling, specifically Monte Carlo simulations or Black-Scholes models.
  • Familiarity with quantum SDKs such as Qiskit, Braket, or Cirq.
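The Monte Carlo pricing mentioned in the preferred qualifications can be sketched in a few lines. This is a minimal, illustrative example (all function names and parameter values are ours): it simulates terminal stock prices under geometric Brownian motion, averages the discounted call payoff, and checks the estimate against the closed-form Black-Scholes price.

```python
import numpy as np
from math import log, sqrt, exp, erf

def black_scholes_call(s0, k, r, sigma, t):
    """Closed-form Black-Scholes price of a European call option."""
    d1 = (log(s0 / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    phi = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))  # standard normal CDF
    return s0 * phi(d1) - k * exp(-r * t) * phi(d2)

def monte_carlo_call(s0, k, r, sigma, t, n_paths=200_000, seed=0):
    """Price the same call by simulating terminal prices under GBM."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    # Risk-neutral GBM: S_T = S_0 * exp((r - sigma^2/2) t + sigma sqrt(t) Z)
    st = s0 * np.exp((r - 0.5 * sigma ** 2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(st - k, 0.0)
    return exp(-r * t) * payoff.mean()

analytic = black_scholes_call(100, 100, 0.05, 0.2, 1.0)
simulated = monte_carlo_call(100, 100, 0.05, 0.2, 1.0)
print(round(analytic, 2), round(simulated, 2))
```

Quantum amplitude estimation targets exactly this kind of workload: it promises a quadratic reduction in the number of samples needed for a given error, provided the distribution can be loaded into a quantum state cheaply enough.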