Job Description
**Job Title: Data Architect (Credit Risk)**
**Job Location: New York City (Hybrid; local candidates only)**
**Job Type: Contract**
**Visa: H-1B candidates cannot be considered**
**Job Description:**
We are seeking a highly skilled and experienced Application Engineer and Data Architect to join our dynamic Credit Risk Capital IT team within the **Credit Risk** Technology Group. As a senior resource, you will play a crucial role in designing, implementing, and maintaining the application and infrastructure for Credit Risk Capital.
**Job Responsibilities and Requirements**
– Lead architecture and technical design discussions using industry-standard technologies and practices; support production operations and resolve production issues as a senior developer on the Credit Risk application team
– Design and implement batch and ad-hoc data pipelines with the Medallion **Lakehouse** architecture using modern cloud data engineering patterns, primarily in **Databricks**
– Build and maintain ingestion flows from upstream systems into object storage (e.g., S3/ADLS) using formats such as Parquet, including partitioning, Z-ordering, and schema evolution (see the PySpark sketch after this list)
– Integrate with external XVA/risk engines and implement orchestration logic to manage long-running external computations
– Model and optimize risk measures (e.g., EPE, PFE) for efficient querying and consumption by BI tools, notebooks, and downstream applications
– Ensure platform reliability, observability, security (IAM roles, OIDC/Bearer-token auth, encryption) and auditability.
– Contribute to API design for internal consumers and external services (versioning, error handling, SLAs), and document designs clearly and efficiently
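For illustration only, below is a minimal PySpark sketch of the kind of Medallion-style ingestion described above. The landing path, table names (`bronze.exposures`, `silver.exposures`), and columns (`as_of_date`, `trade_id`, `counterparty_id`, `exposure`) are hypothetical stand-ins, not the team's actual schema.

```python
# Minimal sketch of a bronze/silver ingestion step on Databricks (PySpark + Delta).
# All paths, table names, and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

RAW_PATH = "s3://example-bucket/credit-risk/exposures/"  # hypothetical landing zone

# Bronze: land raw Parquet as-is, partitioned by business date, tolerating schema drift.
raw = spark.read.format("parquet").load(RAW_PATH)
(raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta")
    .mode("append")
    .option("mergeSchema", "true")          # schema evolution on append
    .partitionBy("as_of_date")
    .saveAsTable("bronze.exposures"))

# Silver: cleaned and conformed layer for risk measures such as EPE/PFE.
(spark.table("bronze.exposures")
      .dropDuplicates(["trade_id", "as_of_date"])
      .filter(F.col("exposure").isNotNull())
      .write.format("delta")
      .mode("overwrite")
      .partitionBy("as_of_date")
      .saveAsTable("silver.exposures"))

# Co-locate frequently filtered keys to speed downstream queries (Databricks Delta feature).
spark.sql("OPTIMIZE silver.exposures ZORDER BY (counterparty_id)")
```

The `OPTIMIZE ... ZORDER BY` step is Databricks-specific and is shown only to indicate where Z-ordering fits in the flow.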
**Required Qualifications, Skills and Capabilities**
– 12-15 years of work experience as an application developer
– AWS Certified Cloud Practitioner (or equivalent certification; level to be confirmed)
– Proficient in REST API development using frameworks such as Django, Flask, or FastAPI (see the FastAPI sketch after this list)
– Strong domain expertise in Credit Risk and Counterparty Risk
– Expert-level proficiency in **Python**, including **PySpark**/**Spark**, for data engineering and analytics
– Hands-on experience with **Azure Databricks**, including the Medallion (Lakehouse) architecture
– Solid understanding of SQL, including joins, unions, stored procedures, and query optimization
– Familiarity with front-end and back-end development (experience is a plus)
– In-depth knowledge of CI/CD pipelines using Git, Jenkins, and Azure DevOps
– Exposure to technical architecture design (experience is a plus)
– Experience in creating product specifications, architecture diagrams, and design documents
– Proven experience in Agile software development using JIRA, Confluence, and Zephyr
– Strong communication skills—able to convey complex ideas clearly and concisely
– Highly collaborative team player with a proactive, self-starter mindset
– Demonstrated ability to learn new technologies quickly
– Passionate about coding, development and continuous improvement
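As a point of reference for the REST API requirement above, here is a minimal FastAPI sketch with a versioned path and basic error handling. The endpoint, response model, and in-memory lookup are illustrative assumptions only, not an actual service contract.

```python
# Minimal sketch of a versioned REST endpoint in FastAPI with basic error handling.
# Endpoint path, model fields, and the lookup table are hypothetical.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Credit Risk Capital API", version="1.0.0")

class ExposureResponse(BaseModel):
    counterparty_id: str
    epe: float   # expected positive exposure
    pfe: float   # potential future exposure

# Hypothetical in-memory store standing in for the real risk results source.
_RESULTS = {"CP001": {"epe": 1.2e6, "pfe": 3.4e6}}

@app.get("/api/v1/exposures/{counterparty_id}", response_model=ExposureResponse)
def get_exposure(counterparty_id: str) -> ExposureResponse:
    result = _RESULTS.get(counterparty_id)
    if result is None:
        # Consistent, documented error responses are part of the API contract.
        raise HTTPException(status_code=404, detail="counterparty not found")
    return ExposureResponse(counterparty_id=counterparty_id, **result)
```

Run locally with `uvicorn`; versioning here is carried in the URL path (`/api/v1/...`), one of several common approaches.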
**Thanks & regards,**
**Akram Khan**