
Job Description


Responsibilities

  • Lead the design, development, and maintenance of scalable and efficient data architectures that support the collection, storage, and analysis of large volumes of financial data.
  • Define and implement data modeling standards, ensuring consistency, integrity, and optimization of data structures.
  • Collaborate with data engineers to build and optimize data pipelines for ingesting, processing, and transforming data from various sources.
  • Evaluate and recommend new technologies, tools, and frameworks to enhance our data architecture and infrastructure.
  • Provide technical leadership and mentorship to junior members of the data team, fostering a culture of innovation and continuous learning.
  • Partner with business stakeholders to understand their data requirements and translate them into actionable solutions that drive business value.
  • Ensure compliance with data governance policies, regulatory requirements, and industry best practices related to data management and security.
  • Drive initiatives to improve data quality, accuracy, and reliability through data profiling, cleansing, and validation processes.
  • Collaborate with IT and security teams to implement robust data protection mechanisms and ensure the confidentiality and integrity of sensitive data.
  • Stay abreast of emerging trends and developments in data management, analytics, and technology, and assess their potential impact on our data architecture strategy.


Required competencies and skill set

  • Bachelor's degree in Computer Science, Information Technology, or a related field. A master's degree is preferred.
  • 12+ years of experience in data architecture, database design, and data modeling, preferably in the fintech or financial services industry.
  • Proven track record of designing and implementing complex data architectures using modern technologies such as cloud platforms (e.g., AWS, Azure, GCP), cloud data warehouses (e.g., Snowflake, Redshift, BigQuery), and distributed computing frameworks (e.g., Spark).
  • Expertise in guiding, implementing, and operationalizing AI/ML models using cloud tools such as SageMaker and Databricks.
  • Extensive experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra).
  • Strong proficiency in data modeling techniques (e.g., ER modeling, dimensional modeling) and data modeling tools (e.g., erwin Data Modeler).
  • Hands-on experience designing and implementing batch and real-time data pipelines, using integration tools (e.g., Apache Kafka, Fivetran) and ETL/ELT processes.
  • Solid understanding of data governance principles, data security best practices, and regulatory compliance requirements (e.g., GDPR, CCPA).
  • Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data issues and optimize performance.
  • Strong communication and interpersonal skills, with the ability to effectively collaborate with cross-functional teams and influence decision-making at all levels of the organization.
  • Certifications in relevant technologies (e.g., AWS Certified Solutions Architect, Microsoft Certified: Azure Data Engineer) are a plus.

Salary

Competitive (monthly based)

Location

Bengaluru, Karnataka, India

Job Overview

Job Posted: 3 days ago
Job Expires: 2w 5d
Job Type: Full Time
Job Role:
Education: Bachelor's Degree
Experience: 12+ Years
Slots: 1
