ETL Informatica Developer/Data Analyst Resume
San Antonio, TX
SUMMARY:
- 7 years of IT experience in Data Warehousing technology and all phases of the Software Development Life Cycle (SDLC), including Business Requirement Analysis, Application Design, Development, Implementation and Testing of data warehousing and database business systems for the Retail, Healthcare, Financial and Insurance domains.
- 7 years of experience in the design and development of ETL processes using Informatica PowerCenter 10/9.6.1/9.0.1/8.x, Informatica PowerExchange 10.1.0/9.6.1/9.0.1 and Informatica Data Quality 10/9.6.1.
- 5+ years of experience working with Oracle 11g/10g, PL/SQL, Netezza and performance tuning.
- 5+ years of experience in Data Quality, Profiling, Validations, reference checks and exception handling using Informatica Data Quality.
- Experience in Big Data technologies and integration using Hadoop, HDFS, Impala and Hive.
- 3+ years of experience in Change Data Capture (CDC) methodology using PowerExchange 9.0/8.x.
- 3+ years of experience in Teradata 14/13 SQL and utilities such as MultiLoad (MLOAD), FastLoad and TPT.
- Experience in working with cloud environment using AWS - EC2 instances, S3 buckets, RDS, SNS, CloudWatch
- Interacted with end-users and functional analysts to identify and develop the Business Requirement Document (BRD) and transform it into technical requirements.
- Worked extensively in design and analysis of Requirements, Development, Testing and Production support.
- Demonstrated experience developing, testing, and deploying ETL and/or CDC solutions (utilizing Informatica PowerCenter and/or PowerExchange)
- Experience in best-practice design of CDC implementations in high-volume environments, including CDC software, database design and tuning.
- Prepared various documents such as requirements-gathering documents, ETL specifications, data mappings, test cases and data dictionaries.
- Expert in writing optimized SQL queries against Oracle, SQL Server, Teradata, MySQL and Hive.
- Experience writing SQL using analytic functions such as ranking functions, reporting aggregate functions, LAG/LEAD functions and FIRST/LAST functions (a short sketch appears at the end of this summary).
- Extensive knowledge of Informatica tuning and SQL/PL/SQL tuning.
- Experience in integration of various data sources like Oracle, SQL server and sequential files into staging area.
- Experience in implementing platforms on AWS.
- Performed unit testing, system testing and integration testing, supported users during UAT, and prepared test reports in different phases of the project.
- Experience in migrating Informatica systems to cloud servers.
- Created and modified UNIX Shell Scripts for ETL jobs.
- Extensive experience with Data Extraction, Transformation, and Loading (ETL) from Multiple Sources.
- Worked on data integration from various source files such as flat files, CSV files, Relational Databases etc. into a common analytical Data Model.
- Experience in Informatica production support; resolved hundreds of tickets involving data issues.
- Designed and developed complex mappings, Mapplets, tasks and workflows, and tuned them for performance.
- Experience in Debugging and Performance tuning of targets, sources, mappings and sessions in Informatica.
- Moderate experience in Informatica Change Data Capture Management
- Experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations, Mapplets and PL/SQL stored procedures.
- Extensively used the Slowly Changing Dimension (SCD) technique in business applications.
- Conducted unit tests, integration tests and User Acceptance Tests (UAT).
- Experience working with MicroStrategy; developed dashboard and scorecard reports.
- Expertise in OLTP/OLAP analysis, E-R modeling and dimensional modeling.
- Developed database schemas such as star schema and snowflake schema used in relational, dimensional and multidimensional modeling.
- Understanding and knowledge of the Information Technology Infrastructure Library (ITIL).
- Working experience in Agile and Waterfall methodologies.
- Excellent communication skills, with the ability to communicate effectively with executive and management teams, and strong analytical and problem-solving skills.
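As a brief illustration of the analytic-function SQL mentioned above, the sketch below runs against a hypothetical ORDERS table (customer_id, order_date, order_amount); the table and column names are illustrative only, not from any specific project.

```sql
-- Illustrative only: ranking, LAG and a running aggregate over a hypothetical ORDERS table.
SELECT customer_id,
       order_date,
       order_amount,
       RANK() OVER (PARTITION BY customer_id ORDER BY order_amount DESC)        AS amount_rank,
       LAG(order_amount) OVER (PARTITION BY customer_id ORDER BY order_date)    AS prev_amount,
       FIRST_VALUE(order_amount) OVER (PARTITION BY customer_id
                                       ORDER BY order_date)                     AS first_amount,
       SUM(order_amount) OVER (PARTITION BY customer_id ORDER BY order_date
                               ROWS UNBOUNDED PRECEDING)                        AS running_total
FROM   orders;
```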
TECHNICAL SKILLS:
ETL TECHNOLOGY: Informatica PowerCenter 10.x/9.x/8.x/7.x (including Informatica Cloud), Informatica PowerExchange 10.1.0/9.x, IDQ 10/9.x
DATA WAREHOUSE: Star Schema, Snowflake schema and Multidimensional Modeling, Development
DATA MODELING: MS Visio, Erwin 8/7.1
DATABASES: Oracle 11g/10g, MS SQL Server 2012/7.0/2000, MS Access, Sybase, DB2, MySQL, Teradata 14/13, Netezza
PROGRAMMING: C, C++, SQL, PL/SQL, HTML, CSS, UNIX Scripting
TOOLS: Quest TOAD, SQL*PLUS, SQL*Loader, SQL*Net, SQL Navigator, Export/Import, Oracle Discoverer 10g
OPERATING SYSTEMS: Windows 10, Windows 98/NT/2000/XP, AIX, Sun Solaris, UNIX, MS-DOS
APPLICATIONS: MS Office, MS Project, FrontPage, Toad 9.2/8.6, Basecamp, Rally
OTHER TOOLS: PVCS, WOTS, MS Office 2007, MS Project
BIG DATA: Hadoop, Sqoop, Hive, Impala
BI REPORTING: MicroStrategy 10.x/9.x, Oracle E-Business Suite
PROFESSIONAL EXPERIENCE:
Confidential, San Antonio TX
ETL Informatica Developer/Data Analyst
Responsibilities:
- Provided technical knowledge/expertise to support the requirements definition, design, and development of data warehouse components/subsystems, specifically Extract-Transform-Load or Extract-Load-Transform technologies and implementations (PowerCenter) and/or Change Data Capture technologies and implementations (PowerExchange)
- Researched, designed, developed, and modified the ETL and related database functions, implementing changes/enhancements to the system
- Designed ETL processes and developed source to target data mappings, workflows, and load processes.
- Developed, tested, integrated and deployed ETL routines using ETL tools and external programming/scripting languages, as necessary.
- Administered and maintained the ETL environment, including performing configuration management activities related to the ETL environment
- Researched, designed, developed and modified the Change Data Capture (CDC) processes and related functions
- Designed CDC processes and developed the associated load processes.
- Developed, tested, integrated and deployed CDC routines using CDC tools and programming/scripting, as necessary.
- Administered and maintained the ETL/CDC environment, including performing configuration management activities related to the ETL/CDC environment
- Participated as needed in the deployment of system upgrades/updates, software patches/releases, and configuration changes
- Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.
- Set up Users and User accounts as per requirements.
- Actively participated in Data Quality services and frameworks
- Considered configuration management, security, quality control in designs/implementations
- Resolved requests for assistance in troubleshooting issues assigned to the development team
- Supported functional/regression testing activities in support of system functionality releases, patches and upgrades
- Documented, tracked, and updated System Change Requests and Problem Reports
- Analyzed process improvement areas/recommend changes to processes/procedures for efficiencies/cost-savings/etc.
- Created, verified and deployed builds to QA and UAT using Transporter.
- Created batch scripts for automated database build deployment.
Environment: Informatica PowerExchange 10.1.0, Informatica PowerCenter 10, Informatica Data Quality 10, Oracle 12c, PL/SQL Developer, Toad, MySQL, SecureFX, Oracle E-Business Suite, ServiceNow (SNOW), PVCS, WOTS.
Confidential, Fort Worth, TX
Sr. Informatica Developer/Data Analyst
Responsibilities:
- Worked with business analysts on requirements gathering, business analysis, testing and project coordination, using interviews, document analysis, business process descriptions, scenarios and workflow analysis.
- Created the Technical Design Document or Minor Release Document (MRD) from the Business Requirements Document (BRD) or Functional Requirements Document (FRD) provided by the business analyst, based on business objectives, and facilitated joint sessions.
- Analyzed business and system requirements to identify system impacts.
- Created flow diagrams and charts.
- Created the detailed Technical Design Documents containing the ETL technical specifications for the given functionality, the overall process flow for each process, flow diagrams, mapping spreadsheets, issues, assumptions, configurations, Informatica code details, database changes, shell scripts, etc., and conducted meetings with the business analysts and clients for approval of the process.
- Analyzed the existing mapping logic to determine the reusability of the code.
- Handled versioning and dependencies in Informatica.
- Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
- Translated the PL/SQL logic into Informatica mappings including Database packages, stored procedures and views.
- Administered Informatica PowerCenter and the Change Data Capture software used for data integration.
- Performed Admin Jobs and Tasks, reviewed logs, Deployed ETL Codes.
- Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
- Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.
- Set up Users and User accounts as per requirements.
- Actively participated in Data Quality services and frameworks
- Created UNIX scripts to read/write and ftp files from and to windows servers and UNIX.
- Created Unit test plans and did unit testing using different scenarios separately for every process. Involved in System test, Regression test and supported the UAT for the client.
- Performed ETL and database code migrations across environments using deployment groups.
- Populated the target tables with business rules implemented in mappings.
- Developed Informatica Data Quality Mappings, sessions, workflows, scripts and orchestration Schedules.
- Involved in end-to-end system testing, performance and regression testing and data validations.
- Worked extensively on modifying and updating existing Oracle code, including object types, views, PL/SQL stored procedures and packages, functions and triggers, based on business requirements.
- Worked in agile minor release cycles as the designated database developer.
- Performed unit testing and supported QA and UAT testing for database changes.
- Managed performance and tuning of SQL queries and fixed the slow running queries in production.
- Helped support data masking projects for DRD across all Dev, QA and UAT environments via Enterprise Data Obfuscation.
- Worked extensively on Teradata.
- Performed a POC to load JSON data into the AWS cloud environment, using S3 buckets and an RDS database, and also loaded the data into a Snowflake database (a brief sketch follows below).
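As a rough illustration of the Snowflake side of that POC: the stage, table and column names below are hypothetical, the credentials and bucket are placeholders, and the actual POC details may have differed.

```sql
-- Hypothetical names throughout; credentials and bucket are placeholders.
CREATE OR REPLACE STAGE poc_json_stage
  URL = 's3://example-poc-bucket/landing/'
  CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
  FILE_FORMAT = (TYPE = 'JSON');

CREATE OR REPLACE TABLE poc_json_raw (raw_doc VARIANT);

-- Land the raw JSON documents from the S3 stage.
COPY INTO poc_json_raw
  FROM @poc_json_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Project selected attributes out of the VARIANT column.
SELECT raw_doc:customerId::STRING AS customer_id,
       raw_doc:orderTotal::NUMBER AS order_total
FROM   poc_json_raw;
```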
Environment: Informatica PowerCenter 9.6.1/Cloud/BDE, Informatica Data Quality 9.6, Oracle 11g, PL/SQL Developer, MySQL, Teradata 14, Tectia Client, PuTTY, SEAL, Subversion (SVN), JIRA, Transporter, ServiceNow (SNOW), Enterprise Data Obfuscation (EDO)
Confidential, St. Petersburg, FL
Informatica Developer
Responsibilities:
- Gathered report requirements with Business Users and converted them into project documentation.
- Developed the dimensional model of the OLAP data marts in Erwin.
- Optimized/Tuned mappings for better performance and efficiency. Performance tuning of SQL Queries, Sources, Targets and sessions.
- Responsible for dimensional data modeling and for populating business rules, via mappings, into the repository for metadata management; understood the business needs and implemented them in a functional database design.
- Maintained Development, Test and Production mapping migrations using Repository Manager; also used Repository Manager to maintain metadata, security and reporting.
- Worked on System Analysis and Design of process/data flow.
- Worked on Teradata SQL and utilities such as MultiLoad (MLOAD) and FastLoad (a short SQL sketch appears after this list).
- Populated the target tables with business rules implemented in mappings.
- Translated PL/SQL logic into Informatica mappings; worked extensively on database stored procedures and views.
- Configured Informatica real-time workflows for Change Data Capture (CDC).
- Located data rapidly with flexible object-filtering techniques to reduce errors and speed development through PowerExchange's point-and-click interface.
- Created Informatica PowerExchange data maps and registrations for mainframe source files and tables and imported them into the Informatica shared folder.
- Used Workflow Manager to create and configure workflows and session tasks to load data; used Workflow Monitor to monitor workflows in case of process failures. Performed testing as per UTC.
- Implemented optimization techniques for performance tuning and wrote Pre-and Post-session shell scripts.
- Configured the sessions using workflow manager to have multiple partitions on source data and to improve performance.
- Involved in end-to-end system testing, performance and regression testing and data validations.
- Worked on Data Extraction, Transformations, Loading, Data Conversions and Data Analysis.
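A minimal sketch of the kind of Teradata SQL used when staging data for such loads; the stg/dw_tgt table and column names are hypothetical, and QUALIFY is Teradata-specific syntax.

```sql
-- Hypothetical staging and target tables; keeps only the latest record per customer.
INSERT INTO dw_tgt.customer_dim (customer_id, customer_name, load_dt)
SELECT customer_id,
       customer_name,
       CURRENT_DATE
FROM   stg.customer_stg
QUALIFY ROW_NUMBER() OVER (PARTITION BY customer_id
                           ORDER BY update_ts DESC) = 1;
```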
Environment: Informatica PowerCenter 9.5/9.1, PowerExchange 9.5/9.1, Teradata 13, Oracle 11g, SQL Server 2012, Erwin 8, PuTTY, Sun Solaris, Rally, Basecamp, WinSCP, and SCM.
Confidential, Washington DC
Informatica Developer
Responsibilities:
- Designed and developed mappings, Mapplets and sessions from source to target database using Informatica PowerCenter.
- Worked with various file formats like fixed length and delimited.
- Used Debugger extensively to validate the mappings and gain troubleshooting information about data and error conditions.
- Developed various mappings using Mapping Designer and worked with Source Qualifier, Aggregator, connected and unconnected Lookup, Filter and Sequence Generator transformations.
- Extensively used Informatica to transfer data from different source systems and load the data into the target database.
- Extracted the data from the flat files and other RDBMS databases into staging area and populated onto Data Warehouse.
- Developed a number of complex Informatica mappings, Mapplets and reusable transformations to implement the business logic and load the data incrementally.
- Used Debugger to track the path of the source data and to check the errors in mapping.
- Handled Slowly Changing Dimensions of Type 1 and Type 2 to populate current and historical data to dimension and fact tables in the Data Warehouse (a SQL sketch of the Type 2 pattern appears after this list).
- Developed Informatica mappings by usage of aggregator, SQL overrides in lookups, source filter and source qualifier and data flow management into multiple targets using router transformations.
- Extensively involved in performance tuning of the ETL process by determining the bottlenecks at various points like targets, sources, mappings, sessions or systems. This led to a better session performance.
- Documented the process for further maintenance and support; worked on test cases and use cases.
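A minimal SQL sketch of the Type 2 pattern referenced above, assuming hypothetical CUSTOMER_DIM and CUSTOMER_STG tables and an illustrative tracked attribute (customer_addr); in the project itself this logic was implemented as Informatica mappings rather than hand-written SQL.

```sql
-- Hypothetical tables, columns and sequence; dates and flags are illustrative.
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE customer_dim d
SET    d.effective_end_dt = TRUNC(SYSDATE) - 1,
       d.current_flag     = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   customer_stg s
               WHERE  s.customer_id   = d.customer_id
               AND    s.customer_addr <> d.customer_addr);

-- Step 2: insert a new current row for changed or brand-new customers.
INSERT INTO customer_dim (customer_key, customer_id, customer_addr,
                          effective_start_dt, effective_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_addr,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.customer_id  = s.customer_id
                   AND    d.current_flag = 'Y');
```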
Environment: Informatica PowerCenter 8.6, SQL Server 2008, Oracle 11g, UNIX, Sun Solaris, Harvest, Remedy, JIRA, PuTTY, WinSCP, Basecamp, Erwin 8.5, Toad.