
ETL/Informatica Developer Resume

McLean, VA

SUMMARY

  • 5+ years of IT experience in the analysis, design, development, implementation, testing, and support of data warehousing and data integration solutions, with expertise in data analysis.
  • Strong data warehousing ETL experience using Informatica 9.6.1/9.5.1/8.6.1/8.5/8.1/7.1 PowerCenter client tools (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect as ETL tools against Oracle, DB2, and SQL Server databases.
  • Experience in the design and development of scalable systems using Hadoop technologies in multiple environments. Extensive experience analyzing data with the Hadoop ecosystem, including HDFS, MapReduce, Hive, and Pig.
  • Practical experience with dimensional and relational data modeling, including star and snowflake schemas with facts and dimensions.
  • Experience performing root cause analysis on ETL processes, resolving production issues, validating data, running routine database tests, and supporting ETL applications.
  • Extensive experience developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server and Oracle PL/SQL.
  • Proficient in integrating various data sources with multiple relational databases such as Oracle 11g/10g/9i, MS SQL Server, DB2, Teradata, and flat files into the staging area, ODS, data warehouse, and data marts.
  • Experience with dimensional modeling using star schema and snowflake models.
  • Created UNIX shell scripts to run Informatica workflows and control the ETL flow (a sketch of such a script follows this list).
  • 2+ years’ experience using predictive modeling, data processing, and data mining algorithms to solve challenging problems using Python Programming.
  • Exposure to building data models and applying learning algorithms in both supervised and semi-supervised learning projects.
  • Hands-on experience training machine learning models (scikit-learn, TensorFlow, Keras) and cleaning, blending, and visualizing data (pandas, Spark, Matplotlib).
  • Hands-on experience with Amazon Web Services (EC2, S3, RDS, Elastic Load Balancing, SQS, Identity and Access Management, AWS CloudWatch, EBS, and Amazon CloudFront).
  • Designed, developed, and implemented jobs that trigger UNIX shell scripts to import data from source systems and land it in HDFS through AWS S3 storage.
  • Strong software engineering skills, including experience with a modern agile development workflow using Git, unit testing, and CI/CD pipelines with Jenkins.
  • Worked with Docker container snapshots, attaching to running containers, managing containers, and directory structures.
  • Performed unit testing using JUnit and Mockito, supporting test-driven development in some scenarios.
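
A minimal sketch of the kind of wrapper script referred to above, assuming hypothetical names throughout (integration service, domain, repository folder, workflow, S3 bucket, and file paths): it starts a PowerCenter workflow with pmcmd, then stages the extracted file in S3 and lands it in HDFS.

#!/bin/bash
# Sketch only: trigger an Informatica workflow with pmcmd, then push the
# extracted file to HDFS via an S3 staging bucket. Every name below is a
# placeholder, not a value from the projects in this resume.
set -euo pipefail

INFA_USER="etl_user"                        # hypothetical repository user
INFA_PWD="${INFA_PWD:?set INFA_PWD}"        # password comes from the environment
SERVICE="INT_SVC_DEV"                       # integration service (placeholder)
DOMAIN="DOM_DEV"                            # Informatica domain (placeholder)
FOLDER="SALES_DW"                           # repository folder (placeholder)
WORKFLOW="wf_load_daily_sales"              # workflow to run (placeholder)

EXTRACT_FILE="/data/extracts/daily_sales.dat"
S3_BUCKET="s3://example-raw-bucket/daily_sales/"    # hypothetical raw bucket
HDFS_DIR="/landing/daily_sales"

# Start the workflow and wait for it to finish; pmcmd exits non-zero on failure.
pmcmd startworkflow -sv "$SERVICE" -d "$DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" -f "$FOLDER" -wait "$WORKFLOW"

# Stage the extract in S3, then land it in HDFS.
aws s3 cp "$EXTRACT_FILE" "$S3_BUCKET"
hdfs dfs -mkdir -p "$HDFS_DIR"
hdfs dfs -put -f "$EXTRACT_FILE" "$HDFS_DIR/"

In practice a script like this would be scheduled through Autosys or Control-M rather than run by hand.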

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.5.1/9.1/8.x/7.1

Hadoop/Big-data Technologies: HDFS, MapReduce, Hive, Pig, Sqoop, Scala, Spark, Kafka

Databases: Oracle 11g/10g/9i/8i, Teradata V2R5/R6/R12/R13, SQL Server 2008/2005, DB2 9.5

Loaders: Teradata FastLoad (FLOAD), MultiLoad (MLOAD), FastExport, TPump, TPT, BTEQ, SQL*Loader

Languages: ANSI SQL, T-SQL, PL/SQL, Stored Procedures, UNIX Shell Scripts, Java, XML

Other Tools: SQL*Plus, Teradata SQL Assistant, Oracle SQL Developer, PL/SQL Developer, Web Services, MS Office, MS Visio, TOAD, FTP, SVN, PuTTY, WinSCP, HP QC

Operating Systems: Windows NT/2000/03/XP/Vista/7, Red Hat Linux, UNIX AIX 5.0/5.2/6.0, Solaris

Scheduler Tools: Autosys, Control-M, CA7, Informatica Scheduler, DataStage Director.

Data Modeling: Erwin r7.3/4/3.5, MS-Visio 2010/2007

Programming Skills: C++, UNIX Shell Scripting, PL/SQL, Perl, FORTRAN, Python, Java (Eclipse and NetBeans IDEs), HTML, JavaScript, J2EE, CSS.

PROFESSIONAL EXPERIENCE

Confidential, McLean - VA

ETL/Informatica Developer

Responsibilities:

  • Analyzed the Technical Specification Document (source-to-target matrix) for all mappings and clarified issues with the Data Architects.
  • Responsible for developing, supporting, and maintaining ETL (Extract, Transform, Load) processes using Informatica PowerCenter 9.6.1.
  • Extensively used Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Repository Manager, and Workflow Manager.
  • Extracted data from various source systems such as Oracle, SQL Server, and DB2 into the landing zone, then loaded it into the AWS S3 raw bucket using a Java copy command.
  • Created sessions and batches to run the mappings and set session parameters to improve load performance. Developed mapplets and reusable transformations.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Integrated existing Informatica PowerCenter ETL with multi-dimensional data models based on a snowflake schema.
  • Performed data analysis for source and target systems, applying data warehousing concepts: staging tables, dimensions, facts, star schema, and snowflake schema.
  • Developed workflows using the Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager and monitored results in Workflow Monitor.
  • Integrated external business applications with the Informatica MDM hub using batch processes.
  • Responsible for performance tuning at the mapping, session, source, and target levels for Slowly Changing Dimension Type 1 and Type 2 data loads.
  • Used the Debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations.
  • Used pre- and post-session assignment variables to pass variable values from one session to another.
  • Designed workflows with many sessions using decision, assignment, event-wait, and event-raise tasks, and used the Informatica scheduler to schedule jobs.
  • Extensively used SQL*Loader to load data from flat files into Oracle database tables (sketched after this list).
  • Built reports according to user requirements.
  • Strong experience writing UNIX shell scripts for data file validation and working with Informatica components in a UNIX environment.
  • Prepared unit and system test plans and test cases for the developed mappings.
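
A minimal sketch of a flat-file validation and SQL*Loader load of the kind described above; the data file, control file, connect string, and the trailer-record convention are all assumptions for illustration.

#!/bin/bash
# Sketch only: validate an inbound flat file, then load it with SQL*Loader.
set -euo pipefail

DATA_FILE="/data/inbound/customers_$(date +%Y%m%d).dat"
CTL_FILE="/etl/ctl/customers.ctl"          # hypothetical SQL*Loader control file
LOG_DIR="/etl/logs"
ORA_CONN="${ORA_CONN:?set ORA_CONN, e.g. user/pwd@ORCL}"

# Basic validation: file exists and is non-empty before loading.
if [[ ! -s "$DATA_FILE" ]]; then
    echo "ERROR: missing or empty data file: $DATA_FILE" >&2
    exit 1
fi

# Reject the file if the data row count does not match the trailer count
# (assumes the last line of the file carries the expected row count).
expected=$(tail -1 "$DATA_FILE")
actual=$(( $(wc -l < "$DATA_FILE") - 1 ))
if [[ "$expected" != "$actual" ]]; then
    echo "ERROR: trailer count $expected != data rows $actual" >&2
    exit 1
fi

# Load with SQL*Loader; errors=0 aborts the load on the first rejected record.
sqlldr userid="$ORA_CONN" control="$CTL_FILE" data="$DATA_FILE" \
       log="$LOG_DIR/customers.log" bad="$LOG_DIR/customers.bad" errors=0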

Environment: Informatica PowerCenter 9.6.1, Oracle 11g, DB2, SQL Server, SQL*Loader, SQL Developer, PL/SQL Developer, PuTTY, PL/SQL, UNIX Shell Scripting, AWS S3.

Confidential

ETL/ Informatica Developer

Responsibilities:

  • Used Informatica PowerCenter 9.5.1 to extract, transform, and load data into the Oracle data warehouse from various sources such as flat files, XML files, Oracle tables, and DB2 tables.
  • Designed various mappings using transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Stored Procedure, Sorter, Lookup, Aggregator, and Joiner.
  • Developed mappings, reusable objects, transformations, and mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter.
  • Used task types such as decision, timer, event-wait, and command tasks to maintain dependencies between sessions and to balance initial/bulk load processes within Informatica workflows.
  • Worked with transformations such as Router, Expression, SQL, Lookup, Normalizer, and Aggregator to build complex mappings.
  • Wrote Oracle packages and functions to implement business rules and transformations after staging is complete (see the sketch after this list).
  • Worked with pre- and post-session tasks, and extracted data from transaction systems into the staging area.
  • Used Oracle SQL Developer, PL/SQL Developer, TOAD, and Teradata SQL Assistant to analyze existing data and design complex SQL queries for mappings.
  • Worked on the golden copy of master data in Informatica's master data management tool (Informatica MDM).
  • Optimized the performance of the designed Informatica workflows and identified bottlenecks in different areas after full-volume system runs.
  • Coded UNIX shell scripts and Perl scripts to automate data loading tasks.
  • Developed mappings based on Change Data Capture (CDC) techniques.
  • Performed code reviews with peers and created unit test plan document.
  • Created test cases for Unit test, System Integration test and UAT to check the data quality.
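
A minimal sketch of the post-staging step mentioned above: once staging completes, a shell wrapper calls an Oracle package that applies the business rules. The connect string and the stg_pkg.apply_business_rules procedure are hypothetical placeholders.

#!/bin/bash
# Sketch only: run the post-staging business-rule package via SQL*Plus.
set -euo pipefail

ORA_CONN="${ORA_CONN:?set ORA_CONN, e.g. user/pwd@ORCL}"
LOAD_DATE="${1:-$(date +%Y-%m-%d)}"   # optional run-date argument

sqlplus -s "$ORA_CONN" <<EOF
WHENEVER SQLERROR EXIT SQL.SQLCODE
SET SERVEROUTPUT ON
-- Apply business rules/transformations from staging to the target tables
-- (package and parameter names are placeholders).
BEGIN
  stg_pkg.apply_business_rules(p_load_date => TO_DATE('$LOAD_DATE', 'YYYY-MM-DD'));
END;
/
EXIT
EOF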

Environment: Informatica PowerCenter 9.5.1, Oracle 10g, DB2, SQL Server, SQL*Loader, SQL Developer, PL/SQL Developer, TOAD, PuTTY, PL/SQL, UNIX Shell Scripting, Flat Files, Control-M, FTP.

Confidential

Programmer Analyst

Responsibilities:

  • Participated in all phases of SDLC from Requirements gathering, Design, Development, testing, Production, user training and support for production environment.
  • Understood the ETL specifications and built ETL mappings on a daily basis, fetching daily data from relational and flat-file sources of OLTP systems.
  • Created complex mappings and sessions per business requirements to implement the logic for loading data into the staging area using Informatica PowerCenter 8.6.1.
  • Performed data manipulations using various Informatica transformations such as Filter, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter, and Union.
  • Used SCD Type 1 and Type 2 mappings to update slowly changing dimension tables (see the sketch after this list).
  • Extracted data from Oracle and SQL Server and loaded it into Teradata for data warehousing.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Created sessions, extracted data from various sources, transformed it according to the requirements, and loaded it into the data warehouse.
  • Extracted data from Oracle, DB2, CSV, and flat files.
  • Wrote shell scripts to run workflows in a UNIX environment.
  • Prepared migration document to move the mappings from development to testing and then to production repositories.
  • Worked with the Tidal job scheduling tool to monitor job execution in all environments.
  • Performed the Unit Testing, System Integration testing on developed code in all environments.
  • Performed performance tuning at the source, target, mapping, and session levels.
  • Provided support during the initial days after code moved to production.
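
A minimal sketch of an SCD Type 2 load of the kind described above, expressed as a BTEQ script wrapped in a shell heredoc: changed current rows are expired, then fresh current versions are inserted. The logon, table names (dim_customer, stg_customer), and columns are hypothetical placeholders.

#!/bin/bash
# Sketch only: SCD Type 2 expire-and-insert against a Teradata dimension.
set -euo pipefail

TD_LOGON="${TD_LOGON:?set TD_LOGON, e.g. tdprod/etl_user,password}"

bteq <<EOF
.LOGON $TD_LOGON;

/* Close out current dimension rows whose tracked attribute changed. */
UPDATE dim_customer
SET end_dt = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE current_flag = 'Y'
  AND customer_id IN (
      SELECT s.customer_id
      FROM stg_customer s
      JOIN dim_customer d
        ON d.customer_id = s.customer_id
       AND d.current_flag = 'Y'
      WHERE d.customer_name <> s.customer_name);

/* Insert a fresh current row for new customers and for the ones just expired. */
INSERT INTO dim_customer (customer_id, customer_name, start_dt, end_dt, current_flag)
SELECT s.customer_id, s.customer_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id
 AND d.current_flag = 'Y'
WHERE d.customer_id IS NULL;

.LOGOFF;
.QUIT;
EOF

In the project above the same expire-and-insert pattern was implemented inside Informatica SCD Type 2 mappings; the SQL here just makes the pattern explicit.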

Environment: Informatica 8.6.1, Oracle 11g, SQL Server 2008, Teradata, UNIX, Tidal.
