Data Warehouse Resume
SUMMARY
- Amazon Web Services (AWS) Cloud Practitioner
- Google Cloud Platform (GCP) Data Engineer Certified
- Teradata Certified Professional
- DataStage 8.5 Certified Professional
- 10+ years of experience in Information Technology spanning analysis, development, and implementation across the data warehouse life cycle.
- A decade of experience in ETL design, development, and testing on various projects using ETL tools such as IBM DataStage and Talend Data Integration.
- Around 5 years of experience with Talend, with expert-level knowledge of Talend Studio versions 7.x and 6.x.
- Extensive hands-on work with cloud platforms such as AWS and Azure, along with professional certification and knowledge of Google Cloud Platform (GCP).
- Worked mainly in the areas of ETL development, testing, and SQL.
- Extensive experience in creating data lakes for multiple business uses.
- Good exposure to data ingestion, data preparation, data cataloging, and ETL project migration.
- Expert-level knowledge of ELT development in Talend Studio.
- Expertise with databases including Teradata, Oracle, Netezza, MySQL, Aurora DB, and Azure Synapse.
- Extensively worked with Unix/Linux commands and shell scripts.
- Worked with scheduling tools such as Autosys, Control-M, Zena, and TAC.
- Performed DataStage administrator activities such as project creation, starting and stopping DataStage services, and resolving admin-related issues.
- Well trained in Python, QlikView, Qlik Sense, Big Data, and AWS.
- Good exposure to AML (anti-money laundering) compliance and KYC.
- Good knowledge of the banking, credit risk, risk mitigants, AML (anti-money laundering) compliance, KYC, and sales & services domains.
TECHNICAL SKILLS
- AWS Cloud Practitioner
- Google Cloud Platform Training
- DataStage 8.5
- Teradata Certified Professional
PROFESSIONAL EXPERIENCE
Data Warehouse
Confidential
Responsibilities:
- Gather requirements from the business.
- Analyze, design, and develop ETL (extraction, transformation, and loading) data pipelines according to business requirements.
- Develop ETL code in Talend Open Studio to integrate source transaction-system data into the Azure cloud environment, using components such as tSAP (to read data from SAP), tAzureStorage (to store data in Azure cloud), and tMap (for transformations).
- Perform database performance tuning and system optimization.
- Write analytical SQL queries against Azure Synapse to answer business questions (illustrated in the sketch after this list).
- Build detailed ETL design documents for the team's ETL development activities.
- Build ETL data pipelines for data preparation and data ingestion in Talend Open Studio.
- Build data warehouse and data lake solutions in AWS Cloud for multiple business uses.
- Develop and run Unix scripts for source and target file processing and batch scheduling.
- Unit test developed data pipeline code, perform SIT (system integration testing), and conduct UAT (user acceptance testing) with business users.
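The Azure Synapse queries noted above were project-specific; purely as a minimal, hypothetical sketch of that pattern (the pyodbc connection string, fact table, and column names below are illustrative placeholders, not details from the project), such a query might be run from Python like this:

```python
# Minimal sketch: analytical query against an Azure Synapse dedicated SQL pool.
# Server, database, credentials, and table names are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:example-workspace.sql.azuresynapse.net,1433;"  # placeholder server
    "Database=dw;Uid=etl_user;Pwd=<secret>;Encrypt=yes;"
)
cur = conn.cursor()

# Example business question: total transaction amount per region.
cur.execute(
    """
    SELECT region, SUM(amount) AS total_amount
    FROM dbo.fact_transactions   -- hypothetical fact table
    GROUP BY region
    ORDER BY total_amount DESC
    """
)
for region, total_amount in cur.fetchall():
    print(region, total_amount)

cur.close()
conn.close()
```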
Environment: Talend Studio 7.2, AWS S3, Azure Blob Storage, Linux, Azure Synapse, MySQL
Project: AML (Anti-Money Laundering)
Confidential
Senior Software Engineer
Responsibilities:
- Interacted with source systems to gather the business requirements.
- Provided solutions to the business based on those requirements.
- Analyzed the FSD and prepared the TSD accordingly.
- Developed Talend jobs using components such as tMap, tJava, and tAggregator.
- Developed jobs using stages such as Sequential File, Transformer, Funnel, Join, Lookup, Filter, Remove Duplicates, and Sort.
- Performed performance monitoring and system optimization.
- Gathered information and provided it to the team for design and development.
- Supported the external testing team during SIT.
Environment: DataStage 11.5, Talend 6.4, Windows XP, Unix, Netezza, Oracle, MySQL, Zena job scheduler.
Confidential
Data Engineer
Responsibilities:
- Interacted with source systems to gather the business requirements.
- Provided solutions to the business based on those requirements.
- Analyzed the FSD and prepared the TSD accordingly.
- Developed Talend jobs using components such as tMap, tJava, and tAggregator.
- Developed visualizations for the customer.
- Developed reports in QlikView.
- Performed performance monitoring and system optimization.
- Used RTC to export/import DataStage objects on a regular basis.
- Gathered information and provided it to the team for design and development.
- Supported the external testing team during SIT.
Environment: DataStage 11.5, Talend 6.4, Windows XP, Unix, Netezza, Oracle, MySQL, Zena job scheduler.
Confidential
Data Engineer
Responsibilities:
- Interacted with source systems to gather the business requirements.
- Provided solutions to the business based on those requirements.
- Analyzed the FSD and prepared the TSD accordingly.
- Developed jobs using stages such as Sequential File, Transformer, Funnel, Join, Lookup, Filter, Remove Duplicates, and Sort.
- Used RTC to export/import DataStage objects on a regular basis.
- Gathered information and provided it to the team for design and development.
- Supported the external testing team during SIT.
- Led a team of 3 resources.
Environment: DataStage 8.5, Windows XP, Unix, Teradata SQL Assistant, RTC, Control-M job scheduler.
Confidential
Data Engineer
Responsibilities:
- Interacted with different source-system teams and the Moody's team on data flow.
- Traveled regularly from offshore to onsite to gather country-specific requirements and discuss them with business stakeholders.
- Analyzed the FSD and prepared the TSD accordingly.
- Used Git to export/import DataStage objects on a regular basis.
- Ran job sequences.
- Supported the external testing team during SIT.
- Led a team of 9 resources.
Environment: DataStage 8.5, Windows 7, OBIEE 11g, Unix, Control-M job scheduler, Git.
Confidential
Data Engineer
Responsibilities:
- Developed and modified DataStage jobs in the DS parallel environment.
- Used DataStage stages such as Sequential File, SCD, Oracle Enterprise, Transformer, Copy, Modify, Join, Lookup, Data Set, Sort, Pivot, Surrogate Key Generator, and Remove Duplicates.
- Developed and modified DataStage job sequences in the DS parallel environment.
- Performed basic DataStage 8.5 admin activities such as project creation, starting/stopping the server, and resolving admin-related issues.
- Mentored 3 resources.
- Coordinated with the onsite team during testing.
- Identified locks in DS projects and released them.
- Triggered jobs through the Autosys WCC tool (force start, on hold, off hold, mark success).
Environment: DataStage 7.5.x2, 8.1 & 8.5, Windows XP, Oracle 9i, Unix, Autosys, SVN.
Confidential
Data Engineer
Responsibilities:
- Designed and developed ETL processes in DataStage Designer to load data from Oracle, SQL Server, and flat files into a staging database, and from staging into the target data warehouse database (a simplified illustration of this staging-load pattern follows this list).
- Involved in performance tuning and optimization of DataStage jobs using features such as parallelism and data/index cache to manage very large volumes of data.
- Involved in admin activities such as restarting the DS engine and creating projects.
- Analyzed and helped prepare the source-to-target mapping documents.
- Documented ETL test plans, test cases, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
- Ensured ETL best practices were followed and peer-reviewed job designs.
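The loads described above were built in DataStage Designer; purely as a language-neutral illustration of the flat-file-to-staging step (the file name, ODBC data source, and staging table below are hypothetical, not from the project), a Python sketch of the same pattern might look like:

```python
# Illustration only: the actual loads were implemented in DataStage Designer.
# File name, DSN, and staging table below are hypothetical placeholders.
import csv
import pyodbc

conn = pyodbc.connect("DSN=staging_db")  # placeholder ODBC data source
cur = conn.cursor()
cur.fast_executemany = True              # batch the inserts for larger files

# Read the flat file into memory (fine for modest volumes).
with open("customers.csv", newline="") as fh:
    rows = [
        (rec["customer_id"], rec["name"], rec["country"])
        for rec in csv.DictReader(fh)
    ]

# Load into the staging table before downstream warehouse transformations.
cur.executemany(
    "INSERT INTO stg_customer (customer_id, name, country) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
cur.close()
conn.close()
```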
Confidential
Data Engineer
Responsibilities:
- Developed and modified DataStage jobs in the DS parallel environment.
- Used DataStage stages such as Sequential File, Oracle Enterprise, Transformer, Copy, Modify, Join, Lookup, Data Set, Sort, Pivot, Surrogate Key Generator, and Remove Duplicates.
- Developed and modified DataStage job sequences in the DS parallel environment.
- Prepared unit test cases and test scenarios.
- Performed unit testing, integration testing, and production implementation.
- Scheduled jobs to run at specific dates and times and on regular daily, weekly, and similar schedules.
- Coordinated with the onsite team during testing.
- Understood the data warehouse life cycle implementation.
- Analyzed ETL design documents and the physical data model.
- Developed DataStage jobs and job sequences.
- Performed metadata management and data set management.
- Extensively used stages such as Sequential File, Transformer, Funnel, Join, Lookup, ODBC, Sort, Filter, and Remove Duplicates.
- Exported/imported DataStage objects on a regular basis.
- Ran job sequences.
- Performed unit testing, integration testing, and UAT validation.