AWS ETL Developer / Sr. Informatica Developer Resume
SUMMARY
- 9+ years of experience in Information Technology with a strong background in analyzing, designing, developing, testing, and implementing data warehousing applications in verticals such as Healthcare and Retail. Hands-on experience building cloud data warehouses on AWS (S3 and Redshift).
- Experience in defining project scope, requirements gathering from business users and system design (functional and technical specifications).
- Experience in Data Modeling, Star/Snowflake schema modeling, Fact and Dimension Tables, Physical and Logical data modeling using Erwin Data modeling tool.
- Strong understanding of data warehousing methodologies, including the Bill Inmon and Ralph Kimball approaches.
- Hands on experience on reporting tools like OBIEE, Cognos and Business Objects.
- Extensive experience in using Informatica Power Center 9.6/9.1/8.6.1/8.5/8.1/7.x/6.x, Power Exchange, Power Mart 5.x/4.x, IDQ (Informatica Data Quality), Informatica MDM (Master Data Management).
- Extensively used Informatica client tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager, and server tools such as Informatica Server and Repository Server.
- Extensive experience in debugging mappings. Identified bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations.
- Experience in designing and developing complex mappings using transformations such as Connected and Unconnected Lookup, Source Qualifier, Joiner, Expression, Filter, Sorter, Aggregator, Router, Update Strategy, Stored Procedure, Sequence Generator, and reusable transformations.
- Proficient in integrating various data sources, including relational databases such as Oracle 11g/10g/9i, MS SQL Server, and DB2, as well as COBOL files, XML files, and flat files (fixed width, delimited), into the staging area of an ODS, Data Warehouse, or Data Mart.
- Extensively worked on PL/SQL packages, procedures, cursors, functions, and triggers, and on the creation of schema objects: tables, indexes, constraints, sequences, synonyms, triggers, views, and inline views.
- Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
- Experience in implementing update strategies, incremental loads, incremental aggregation, and Change Data Capture.
- Experience in identifying performance bottlenecks and tuning of Informatica sources, targets, mappings, Transformations, and sessions for better performance.
- Excellent skills in using version control in Informatica.
- Excellent skills in writing UNIX shell scripts to load files into the Informatica source file directory and to encrypt and securely transfer files using SFTP. Worked with pre-session and post-session UNIX scripts to automate ETL jobs using the AutoSys, Appworx, and CONTROL-M schedulers, and was involved in the migration of ETL processes from development to QA and from QA to production environments.
- Experience in providing 24/7 Production Support.
- Extensive knowledge of Software Development Life Cycle (SDLC), having thorough understanding of various phases like Requirements, Analysis/Design, Development and Testing.
- Experience in unit testing in Informatica Power Center.
- Organized, flexible and a quick learner with the ability to multi-task and work independently or in a team environment.
TECHNICAL SKILLS
Data Warehousing: Informatica Power Center 10.1/9.1/8.6.1/8.5/8.1, Power Exchange, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor, OLTP, OLAP, AWS, S3.
Dimensional Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, ERWIN 4.5.
Databases: Oracle 11g/10g/9i/8i, RedShift, Microsoft SQL Server, IBM-DB2.
Programming GUI: C, C++, SQL, PL/SQL, Java, Python, SQL*Plus, XML.
Shell Scripting: Unix Shell Scripting, Korn shell scripting.
Other Tools: SQL*Plus, TOAD, PL/SQL Developer, and PuTTY.
Operating Systems: Windows 95/98, Windows NT/2000/XP Professional, MS-DOS, UNIX, Sun Solaris, and HP AIX.
PROFESSIONAL EXPERIENCE
Confidential
AWS ETL Developer/Sr Informatica Developer
Responsibilities:
- Work closely with MOCS application services and operations team on ETL development efforts including analysis and design of integration solutions, data and reporting needs of internal and external stakeholders, and enhancement related to applications and services.
- Work with external tables in Redshift to load the file from S3.
- Hands-on experience using AWS services such as VPC subnets, EC2, S3, CloudFront, SNS, SQS, RDS, IAM, CloudWatch, and CloudFormation, with a focus on high availability.
- Work with Matillion ETL tool in AWS to create SCD I and SCD II mapping to load data into Redshift.
- Hands-on experience with Amazon EventBridge, AWS Step Functions, and Elasticsearch.
- Hands-on experience working with MPP databases such as Redshift.
- Performance tuning and support of ETL, Database jobs.
- Work on AWS CodeCommit for source control and versioning.
- Work with UNIX scripts to encrypt the files and transfer them to the S3 bucket.
- Work on daily jobs to move the files from remote host to S3 environment.
- Design stage jobs to pick up the data files from S3 into the Redshift staging layer.
- Design transformation jobs from stage tables to Redshift target tables per the requirements for Type I and Type II loads.
- Work on creating Python scripts to start and stop the EC2 instances in AWS.
- Make API calls from Matillion to download and parse files from DocuSign.
- Work with Informatica ETL tool to develop SCD I and SCD II mapping.
- Work on Informatica on Cloud in AWS for POC to build source to Target mapping.
- Develop Informatica ETL code to load data in DWH tables.
- Testing of Informatica ETL jobs developed.
- Tuning of Informatica ETL jobs.
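The SCD Type II stage-to-target pattern described above can be sketched in plain Python — a minimal illustration of the expire-and-insert logic, independent of Matillion or Redshift; the record layout (`cust_id`, `address`, `effective_start`/`effective_end`) is hypothetical:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended effective_end marks the current version

def apply_scd2(target, incoming, load_date):
    """Apply SCD Type II logic: expire changed rows, insert new versions.

    target and incoming are lists of dicts keyed by the business key
    'cust_id', with one tracked attribute 'address' (hypothetical layout).
    """
    current = {r["cust_id"]: r for r in target if r["effective_end"] == HIGH_DATE}
    for row in incoming:
        existing = current.get(row["cust_id"])
        if existing is None:
            # brand-new key: insert as the current version
            target.append({**row, "effective_start": load_date,
                           "effective_end": HIGH_DATE})
        elif existing["address"] != row["address"]:
            # changed: close out the old version, open a new one
            existing["effective_end"] = load_date
            target.append({**row, "effective_start": load_date,
                           "effective_end": HIGH_DATE})
        # unchanged rows are left untouched
    return target
```

In a Redshift deployment the same compare/expire/insert steps would typically run as staged SQL, but the row-level logic is the same.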
Environment: Informatica Power Center 10.1, Matillion, Oracle 12, TOAD for Oracle 12, Erwin 10, UNIX Shell Scripting, AWS, Informatica Cloud, Redshift, S3, SQL Workbench, Python, PuTTY, Cognos, crontab for scheduling.
Confidential, NJ
Sr. Informatica Developer
Responsibilities:
- Involved in gathering, analyzing and documenting business requirements and functional requirements and data specifications from users and transformed them into technical specifications.
- Extracted data from various sources and loaded it into tables.
- Based on the requirements, used various transformations like Source Qualifier, Expression, Filter, B2B, Router, Update strategy, Sorter, Lookup, Aggregator and Joiner in the mapping.
- Implemented incremental loads, Change Data Capture, and incremental aggregation.
- Extensively wrote complex SQL queries to extract data from sources and load it into the target tables.
- Identified performance bottlenecks and Involved in performance tuning of sources, targets, mappings, transformations and sessions to optimize session performance.
- Developed SCD Type I mappings.
- Developed UNIX shell scripts to transfer files, archive files.
- Developed UNIX shell scripts to validate header, trailer, and validate source files before extracting.
- Scheduled Informatica workflows using AutoSys.
- Practiced Agile methodology while implementing projects.
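The header/trailer validation step above can be sketched as follows — an illustrative pre-extract check in Python, assuming a hypothetical file layout where the header line carries a file date and the trailer line carries the detail record count:

```python
def validate_file(lines):
    """Validate a delimited source file before extraction.

    Hypothetical layout: first line 'HDR|YYYYMMDD', last line
    'TRL|<detail record count>', detail rows in between.
    Returns (is_valid, message).
    """
    if len(lines) < 2:
        return False, "file must contain a header and a trailer"
    header, trailer, details = lines[0], lines[-1], lines[1:-1]
    if not header.startswith("HDR|"):
        return False, "missing or malformed header record"
    if not trailer.startswith("TRL|"):
        return False, "missing or malformed trailer record"
    expected = int(trailer.split("|")[1])
    if expected != len(details):
        return False, f"trailer count {expected} != {len(details)} detail rows"
    return True, "ok"
```

A shell implementation would follow the same shape (check the first and last records, compare the trailer count against the detail line count) before handing the file to the ETL load.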
Environment: Informatica Power Center 10.1, Oracle 12, TOAD for Oracle 12, Erwin 10, UNIX Shell Scripting, PuTTY, PL/SQL, Cognos, AutoSys scheduling tool.
Confidential, Chicago
Sr. Informatica Developer
Responsibilities:
- Extracted data from various sources like flat files, XML files, and Oracle, and loaded it into tables.
- Worked on Informatica 10.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
- Based on the requirements, used various transformations like Source Qualifier, Expression, Filter, B2B, Router, Update strategy, Sorter, Lookup, Aggregator and Joiner in the mapping.
- Created complex mappings using the Mapping designer, respective workflows and worklets using the Workflow manager.
- Troubleshot the mappings using the Debugger and improved data-loading efficiency using SQL overrides and Lookup SQL overrides.
- Developed SCD Type I and SCD Type II mappings using MD5 logic.
- Implemented incremental loads, Change Data Capture, and incremental aggregation.
- Created UNIX shell scripts to encrypt and decrypt the files, move the files, archive the files and split the files.
- Developed UNIX shell scripts and SQL queries to get data from Oracle tables.
- Created stored procedures, functions and triggers to load data into summary tables.
- Extensively wrote complex SQL queries to extract data from sources and load it into the target tables.
- Implemented parallelism in loads by partitioning workflows using Key Range partitioning.
- Worked on MDM manual maintenance to add, delete, update or merge data.
- Experience in MDM implementation including data profiling, data migration and pre landing processing.
- Created Informatica Data Quality services (Data Integration Service, Analyst Service, and Content Management Service) and gained experience using the Informatica Developer and Analyst tools.
- Experience in Informatica administration, creating repository services and integration services, with hands-on experience in the Admin Console and command-line utilities in Informatica.
- Practiced agile methodology while strategizing and implementing solutions
- Scheduled Informatica workflows using the TWS scheduling tool.
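The MD5-based change detection used in the SCD mappings above can be sketched in Python — a minimal illustration with `hashlib`; the column names are hypothetical:

```python
import hashlib

def row_md5(row, columns):
    """Compute an MD5 digest over the tracked columns of a row,
    so a single hash comparison detects any attribute change."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def needs_new_version(source_row, target_row, columns):
    """A row warrants a new SCD version when its hash differs
    from the hash stored on the current target row."""
    return row_md5(source_row, columns) != row_md5(target_row, columns)
```

Persisting the digest on the target table lets the mapping compare one hash column instead of every tracked attribute, which is the usual motivation for the MD5 approach in SCD loads.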
Environment: Informatica Power Center 10.1, Oracle 12, TOAD for Oracle 12, Erwin 10, UNIX Shell Scripting, IDQ, PuTTY, PL/SQL, MDM 10.1, Cognos, TWS scheduling tool.