
Senior Data Solution & Engineering Architect Resume


SUMMARY:

  • Currently working as a Senior Data Solution & Engineering Architect with ETL and AI/ML for Confidential in the US, using Agile methodology.
  • 14 years of experience in ETL solutioning and data architecture, including 7 years of onshore direct client exposure.
  • Working on data pipelines, data flows, data at rest and data in motion.
  • Strong ETL skills: extract, transform and load using Python, Ab Initio, Unix, Spark, etc. (a minimal Python sketch follows this list).
  • Working with DB2, Oracle, Microsoft SQL Server, IMS (Mainframe), MongoDB (NoSQL) and Microsoft Access; well versed in data warehousing concepts.
  • Solid data modeling experience on OLTP systems using Erwin and Hackolade.
  • Worked on data virtualization using Teiid and Spark, RDF graph data, Solr search and fuzzy matching algorithms.
  • Working in Big Data environments with technologies such as Hive, Hue, Spark, Python, pandas, Scala, Zeppelin, Ambari and Jena SPARQL.
  • Building artificial intelligence and machine learning models using scikit-learn and Keras/TensorFlow.
  • Working on microservice architecture, Python-as-a-service, Docker Swarm and GitLab CI/CD.
  • Worked with ETL tools such as Ab Initio and Informatica.
  • Served in multiple roles: Senior Data Solution & Engineering Architect, Technical Architect, Technical Manager, Project Lead and ETL Designer.
  • Produced project documents such as status trackers, defect metrics, traceability matrices and milestone trackers.
  • Led multiple application teams of 5-12 members.
  • Worked in the financial sector for banking and insurance clients.
  • Worked across all SDLC phases, from requirements gathering, architecture and design to development, maintenance, and L2/L3 support.
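A minimal sketch of the kind of Python ETL step referenced above. The file name, column handling and SQLite target are illustrative placeholders, not actual client systems.

```python
# Minimal extract-transform-load sketch; names are illustrative.
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    """Extract: read a delimited source file."""
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: standardize column names and drop incomplete rows."""
    df.columns = [c.strip().lower() for c in df.columns]
    return df.dropna()

def load(df: pd.DataFrame, table: str, conn: sqlite3.Connection) -> None:
    """Load: write the cleansed frame to a relational target."""
    df.to_sql(table, conn, if_exists="replace", index=False)

if __name__ == "__main__":
    with sqlite3.connect("warehouse.db") as conn:
        load(transform(extract("source_extract.csv")), "customer_stg", conn)
```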

TECHNICAL SKILLS:

Languages: Python, SQL, UNIX shell scripting, Flask, pymongo, pandas, Access VBA, Scala, Jena SPARQL, C, Java

Primary Tools: scikit-learn, Keras, TensorFlow, Big Data technologies, Erwin, Hackolade, Ab Initio (3.1.3.4, 2.15, 2.14, 2.13), Unix, Java, Teiid, Docker, GitLab, Solr

Databases: Oracle, Microsoft SQL Server, MongoDB, DB2, IMS, Access, Teradata, MariaDB

Big Data Technologies: Hadoop (Cloudera and Hortonworks), Hive, Hue, Ambari, Spark, Zeppelin

Abinitio Features: GDE, EME, Conduct>It, PDL, Continuous Flow, ACE, BRE, Operational Console

Other Tools: Informatica 9.6.1, SAS, Rainstor 7.01.01, Maphub, PVCS, HP Quality Centre, VSTS

Operating Systems: DOS, Linux, Mainframe, Windows 98, 2000, XP, 7 & 10

Microsoft Office: Word, Excel, Visio, PowerPoint, Outlook

Domain: Banking and Insurance.

Regulatory Requirements: Basel II, Solvency II

PROFESSIONAL EXPERIENCE:

Confidential

Senior Data Solution & Engineering Architect

Responsibilities:

  • Connect with key business stakeholders to discuss their pain points and gather requirements.
  • Create data flow and data architecture diagrams using Microsoft Visio.
  • Analyze data from different sources and formats: databases (Oracle, SQL Server, DB2, etc.), file systems (JSON, XML, CSV, delimited) and NoSQL (MongoDB).
  • Design ETL and data pipelines using Python and Unix.
  • Create Python microservices and deploy them through Docker and GitLab CI/CD (see the Flask sketch after this list).
  • Work with the business to understand AI/ML use cases and profile historical data for feature identification.
  • Work on decision tree and neural network models.
  • Work on NLP to classify profession descriptions of external customers.
  • Create, train and test machine learning models for use cases such as business unit and business class determination, using Python, scikit-learn and Keras (a classification sketch follows this list).
  • Use Apache Spark (Scala) in a Big Data environment to read RDF graph data and load it into Hive tables.
  • Use Spark (Python) and Teiid for data virtualization, i.e. extract data from relational and NoSQL databases, Hive and file systems into one logical layer (sketched in Python below).
  • Use pymongo to read and write data in MongoDB (sketched below).
  • Use fuzzy matching and Solr search for customer and agent identification (sketched below).
  • Perform detailed profiling and analysis of source systems in Oracle, DB2, SQL Server and Mainframe (IMS).
  • Create and update logical and physical data models for new requirements and generate DDL for relational databases.
  • Maintain models in ModelMart and on a shared drive.
  • Work with the NoSQL MongoDB database to create data models and generate JSON samples for the development team.
  • Perform proofs of concept (POCs) for future projects in technologies such as Python, Java, Access VBA and Big Data.
  • Give business demos to all client stakeholders.
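A minimal sketch of the Python microservice pattern mentioned above, using Flask. The routes and payload shape are assumptions for illustration; the real services were packaged with Docker and deployed via GitLab CI/CD.

```python
# Minimal Flask microservice sketch; endpoint names are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/health")
def health():
    # Liveness probe for the container orchestrator (e.g. Docker Swarm).
    return jsonify(status="ok")

@app.route("/classify", methods=["POST"])
def classify():
    # Accept a JSON payload and return a (stubbed) classification.
    payload = request.get_json(force=True)
    text = payload.get("description", "")
    return jsonify(input=text, business_class="TBD")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```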
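A minimal sketch of the profession-description classification mentioned above, using a TF-IDF plus logistic regression pipeline in scikit-learn. The training rows are toy placeholders; the production models also used Keras.

```python
# Toy text-classification sketch: profession description -> business class.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["retail bakery owner", "software consultant", "roofing contractor"]
labels = ["retail", "professional_services", "construction"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["independent roofing and siding"]))
```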
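A minimal sketch of the Spark (Python) data-virtualization pattern described above: pull relational and file-based sources into one queryable layer. The JDBC URL, credentials, table and path names are placeholders.

```python
# Expose heterogeneous sources as views over one logical layer.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("virtual-layer").getOrCreate()

# Relational source via JDBC (Oracle shown; driver jar must be on the classpath).
accounts = (spark.read.format("jdbc")
            .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")
            .option("dbtable", "ACCOUNTS")
            .option("user", "etl_user").option("password", "***")
            .load())

# File-system source.
events = spark.read.json("hdfs:///landing/events/*.json")

# Register both as views so downstream SQL sees a single logical layer.
accounts.createOrReplaceTempView("accounts")
events.createOrReplaceTempView("events")
joined = spark.sql("""
    SELECT a.account_id, a.segment, e.event_type
    FROM accounts a JOIN events e ON a.account_id = e.account_id
""")
joined.show()
```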
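A minimal sketch of the pymongo read/write usage noted above; the host, database and collection names are illustrative.

```python
# Basic MongoDB round trip with pymongo; names are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
col = client["customer_db"]["profiles"]

col.insert_one({"customer_id": 42, "segment": "retail"})
doc = col.find_one({"customer_id": 42})
print(doc)
```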
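A minimal sketch of fuzzy matching for customer/agent identification, using the standard library's difflib. The production approach combined fuzzy scoring with Solr search; the names below are made up.

```python
# Pick the closest candidate name by sequence similarity.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

candidates = ["John A. Smith", "Jon Smith", "Jane Smythe"]
query = "john smith"
best = max(candidates, key=lambda c: similarity(query, c))
print(best, round(similarity(query, best), 2))
```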

Environment: Erwin 9.6.5, Python 2.x and 3.x, Unix shell scripting, GitLab, Docker Swarm, pandas, Flask, scikit-learn, Keras, Unix, MongoDB, Oracle, SQL Server, DB2, Java, Hive, Zeppelin, Hue, Ambari, Spark, Scala, Jena SPARQL, Access VBA, IMS Explorer

Confidential

ETL Technical Architect & Technical SME

Responsibilities:

  • Perform detailed analysis of the SAS source system to identify inactive data.
  • Define the solution to extract data from the SAS server through Informatica and land target and audit files in Hadoop (HDFS).
  • Define the solution to create table structures in Rainstor mirroring the source.
  • Define the solution for end-to-end auditing (a reconciliation sketch follows this list).
  • Give walkthroughs of the solution to all client stakeholders.
  • Support the Project Manager with daily and weekly status.
  • Help the team get access and set up the pilot environment.
  • Resolve the team's technical and application queries.
  • Help with the Rainstor installation process.
  • Help with on-boarding and knowledge transfer (KT) of new members.
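A minimal sketch of the end-to-end audit reconciliation described above: compare source row counts against what landed in the target. The file paths and the (table, row_count) audit-file format are assumptions for illustration.

```python
# Reconcile source vs. target row counts from audit files.
import csv

def read_counts(path: str) -> dict:
    """Audit files are assumed to hold (table, row_count) pairs."""
    with open(path, newline="") as fh:
        return {row["table"]: int(row["row_count"]) for row in csv.DictReader(fh)}

source = read_counts("source_counts.csv")
target = read_counts("target_counts.csv")

for table, src_rows in source.items():
    tgt_rows = target.get(table, 0)
    status = "OK" if src_rows == tgt_rows else "MISMATCH"
    print(f"{table}: source={src_rows} target={tgt_rows} {status}")
```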

Environment: Informatica ETL tool (9.6.1), SAS, Rainstor 7.01.01, Unix with shell scripting, Hadoop, HDFS

Confidential

Project Lead & ETL Technical SME

Responsibilities:

  • Help configure and test code in the UAT and PROD environments.
  • Implement the solution in Ab Initio and Unix and load the data into the Oracle database.
  • Help the team resolve technical and requirement queries.
  • Complete all ad-hoc requests raised by the client.
  • Prepare the weekly status tracker.
  • Attend all review/status meetings.
  • Monitor the daily activities of individual team members.
  • Help with on-boarding and knowledge transfer (KT) of new members.

Environment: Ab Initio ETL tool (3.1.2.2), Ab Initio Operational Console, Unix (AIX) with shell scripting, Oracle 10g as database.

Confidential

Offshore ETL Designer & Project Lead

Responsibilities:

  • Analyze requirements arriving as functional specs, interface specs and mapping documents.
  • Coordinate with the client SME to streamline requirements.
  • Create the design framework and provide the overall solution using the Ab Initio ETL tool and the project's generic utilities.
  • Create the low-level design (LLD) and hand it to the development team.
  • Guide the dev team in understanding the requirements and design, and help them deliver the required code.
  • Prepare and update the daily status tracker.
  • Prepare the estimation tracker and allocate work to team members.
  • Help the Project Manager prepare various project artifacts.

Environment: Ab Initio ETL tool (3.1.2.2), ACE, BRE, Unix (AIX) with shell scripting, Oracle 10g as database.

Confidential

Offshore ETL Designer & Project Lead

Responsibilities:

  • Analyzed requirements and created the System Design Specification (SDS).
  • Created the design framework using the Ab Initio ETL tool and the project's generic utilities.
  • Forwarded the SDS to the development team and resolved their technical queries.
  • Worked as the interface between onshore and offshore team members.
  • Prepared all project artifacts: estimation sheet, weekly status report, milestone report, metrics report, traceability matrix, etc.

Environment: Ab Initio ETL tool, Unix (AIX) with shell scripting, Mainframe as source and DB2 as database.

Confidential

Onshore Business & System Analyst, Onshore ETL Lead, Offshore ETL Developer

Responsibilities:

  • Understood requirements and wrote the technical specification document and data mapping.
  • Helped the team resolve technical and requirement queries.
  • Attended all review/status meetings and completed all ad-hoc requests raised by the client.
  • Prepared project artifacts such as status reports, access trackers and defect trackers.
  • Worked as an ETL developer on the team, involved in development, enhancement and testing of different graphs.
  • Wrote wrapper scripts to copy files from remote locations, run graphs and load data into the database (a Python sketch of the control flow follows this list).
  • Was moved onshore based on strong performance and client appreciation.
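The original wrappers were Unix shell scripts; this is a Python sketch of the same control flow: pull a file from a remote host, run a graph, then load the output. The host, paths and graph command are made up.

```python
# Wrapper control flow: copy file, run graph, load database.
import subprocess
import sys

def run(cmd: list[str]) -> None:
    """Run a step and abort the wrapper on the first failure."""
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"step failed: {' '.join(cmd)}")

run(["scp", "etl@remotehost:/outbound/daily.dat", "/landing/daily.dat"])
run(["/opt/etl/bin/run_graph.ksh", "load_daily", "/landing/daily.dat"])
run(["db2", "-tvf", "/opt/etl/sql/post_load_checks.sql"])
```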

Environment: Ab Initio ETL tool, Unix (AIX) with shell scripting, and DB2 as database.
