Asset & Wealth Management - PWM Data Engineering - Vice President - Hyderabad
Hyderabad, Telangana, India

Who We Are

At Goldman Sachs, we connect people, capital and ideas to help solve problems for our clients. We are a leading global financial services firm providing investment banking, securities and investment management services to a substantial and diversified client base that includes corporations, financial institutions, governments and individuals.

In Private Wealth Management, we help our clients pursue their wealth management goals through careful advice & astute investment management. PWM Engineering plays a pivotal role in building the tools and applications our business needs to effectively manage and support our clients' diverse requirements. We support the entire user experience, from onboarding through trading and reporting, and give clients access to their portfolios online via native iOS and Android apps. We also build and support numerous applications for our Business and Operations users to help them effectively manage risk and provide the best white-glove service possible to our clients.

The Data Distribution team supports PWM's Quantum data distribution platform, which is considered the primary source of data relating to Client Holdings, Transactions, Taxlots and reference data such as Accounts, Products and Prices in PWM. Several teams and projects require this data, and our team has built several distribution channels for different usage patterns.

We are currently enhancing our platform based on new business requirements as well as growing stability and scalability needs to support a growing business. The platform will be required to manage very high request volumes with varying time sensitivities, prioritizing across multiple tenants via an event-based framework.

Your Impact

We are seeking a highly skilled and experienced Cloud Database Specialist to join our core data platform team. The ideal candidate will possess deep technical expertise in managing and optimizing modern cloud data stacks, specifically focusing on Snowflake, Databricks, and/or SingleStore Helios. You will be responsible for architecting scalable data environments, implementing robust performance tuning strategies, and ensuring the seamless integration of high-performance analytical and operational workloads.

This role offers a unique opportunity to work at the intersection of Data Engineering and Cloud Infrastructure, driving the evolution of our global data platform. You will serve as a technical authority, mentoring junior engineers and influencing the strategic direction of our multi-cloud data architecture to support mission-critical business intelligence and AI/ML initiatives.

Key Responsibilities

  • Architecture & Deployment: Design, deploy, and maintain enterprise-grade cloud database environments (Snowflake, Databricks, Helios) across AWS, Azure, or GCP to support high-concurrency applications and large-scale analytics.
  • Performance Engineering: Develop and implement advanced performance tuning strategies, including query optimization, clustering/partitioning logic, and right-sizing compute resources (e.g., Snowflake Warehouses, Databricks SQL Warehouses).
  • Security & Governance: Partner with Identity & Access Management (IAM) and Security teams to integrate best practices such as Role-Based Access Control (RBAC), Row-Level Security (RLS), and automated provisioning via SCIM. Leverage Unity Catalog (Databricks) or Snowflake Horizon for unified governance.
  • Data Modeling: Collaborate with cross-functional teams to design and implement optimized data models (Star Schema, Data Vault, or Medallion Architecture) that balance storage efficiency with query performance.
  • Reliability & Troubleshooting: Act as the primary escalation point for complex database-related issues, ensuring high availability, data integrity, and rapid resolution of performance bottlenecks.
  • FinOps & Capacity Planning: Perform rigorous capacity planning and cost-governance assessments. Monitor credit consumption and implement automated scaling policies to optimize the total cost of ownership (TCO).
  • SDLC & Automation: Drive improvements in database DevOps and CI/CD pipelines. Integrate automated testing frameworks (e.g., dbt tests) and performance benchmarking into the release process.
  • Innovation: Stay at the forefront of cloud data trends, including Iceberg/Delta Lake interoperability, serverless compute, and AI-driven optimization features, to recommend innovative platform enhancements.
  • Mentorship: Provide technical leadership and mentorship to junior team members, fostering a culture of excellence in cloud database administration and engineering.

Basic Qualifications

We are looking for a senior candidate who will play a key role in driving our transformational efforts from India. Strong communication skills are key: we expect you to constructively volunteer reasoned views and opinions in different team forums. Effectively meeting the demands of our challenging mandate requires a passion for learning as well as deep understanding and experience across a wide array of competencies, such as:

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • 10+ years of database administration experience, including at least 5 years of hands-on experience with SingleStore.
  • Strong proficiency in SQL and database performance tuning techniques.
  • Technical Expertise: Proven experience with Snowflake (Snowpark, Streams, Tasks), Databricks (Delta Lake, Spark SQL, Unity Catalog), or SingleStore Helios (Universal Storage, Pipelines).
  • Cloud Proficiency: Strong hands-on experience with at least one major cloud provider (AWS, Azure, or GCP) and its associated networking and security services.
  • SQL Mastery: Expert-level SQL skills, including complex analytical functions, window functions, and stored procedure development.
  • Automation: Experience with Infrastructure as Code (Terraform/Pulumi) and data transformation tools like dbt (Data Build Tool).
  • Strong communication skills (in person, phone, and email).
  • Strong sense of ownership, focus on quality, responsiveness, efficiency, and innovation.

Preferred Qualifications

  • SnowPro Core / SnowPro Advanced: Architect certification
  • Databricks Certified Data Engineer Professional
  • Cloud Provider Certifications (e.g., AWS Certified Data Engineer, Azure Data Engineer Associate)
  • Experience in the financial services industry.
  • Proficiency in scripting (Python or Bash) and Java.

Goldman Sachs Engineering Culture

At Goldman Sachs, our Engineers don’t just make things – we make things possible. Change the world by connecting people and capital with ideas. Solve the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning alongside financial engineering to continuously turn data into action. Create new businesses, transform finance, and explore a world of opportunity at the speed of markets.

Engineering is at the critical center of our business, and our dynamic environment requires innovative strategic thinking and immediate, real solutions. Want to push the limit of digital possibilities? Start here!

© The Goldman Sachs Group, Inc., 2026. All rights reserved.
Goldman Sachs is an equal employment/affirmative action employer: Female/Minority/Disability/Veteran/Sexual Orientation/Gender Identity.