Job Description
Position Title:
**Data Engineer / Data Modeler (DataStage / Payments Subsystem)**
Location: Onsite 2-3 days a week in Atlanta, GA.
Visa: H-1B transfer
Duration: Long-term
**Job Details:**
We are seeking an experienced Data Engineer / Data Modeler to join our banking technology team, with a focus on the payments subsystem. The role involves designing, developing, and maintaining scalable data pipelines and models to support real-time and batch payment processing, settlement, reconciliation, and regulatory reporting. The ideal candidate will have strong expertise in data modeling, ETL development, cloud data platforms, and modern data engineering practices.
**Key Responsibilities**
– Design and implement logical and physical data models for the payments domain (merchant onboarding, transactions, clearing, settlement, funding, fraud/risk, reporting).
– Develop ETL/ELT pipelines to integrate data from core banking systems, payment gateways, and external partners into enterprise data stores.
– Ensure data quality, lineage, and governance across payment processing flows.
– Build and optimize data warehouses/lakes on cloud platforms (AWS or Azure).
– Collaborate with architects, developers, and business analysts to map payment business processes into data models and schemas.
– Implement real-time and batch data integration for use cases such as fraud detection, reconciliations, settlement reporting, and regulatory compliance.
– Monitor and tune pipelines for scalability, performance, and cost efficiency.
– Partner with security and compliance teams to enforce data privacy and regulatory requirements (PCI-DSS, SOX, AML/KYC).
**Required Qualifications**
– Bachelor’s degree in Computer Science, Information Systems, or a related field (or equivalent experience).
– 7+ years of data engineering / data modeling experience in financial services or banking.
– Strong knowledge of payments domain concepts (authorization, clearing, settlement, reconciliation, chargebacks, fraud).
– Hands-on experience with data modeling (3NF, star/snowflake schemas, dimensional modeling).
– Proficiency with SQL and at least one ETL/ELT tool (DataStage, Informatica, Talend, dbt).
– Strong experience with cloud data platforms (AWS Redshift, Azure Synapse, Snowflake, Databricks).
– Proficiency in a programming language such as Python for data processing and automation.
– Experience with real-time streaming (Kafka, Kinesis, EventHub) and batch data pipelines.
– Familiarity with version control (Git) and CI/CD pipelines.
**Preferred Skills**
– Knowledge of regulatory and compliance requirements in payments/banking.
– Understanding of API-based integration for payment systems.
– Familiarity with containerization (Docker, Kubernetes) for data services deployment.