Sr. Informatica Developer Resume
St Louis, MO
PROFESSIONAL SUMMARY:
- Around 5 years of IT experience across all phases of the Software Development Life Cycle (SDLC), including user interaction, business analysis/modeling, design, development, integration, planning, testing, and documentation for data warehouse applications, ETL processing, and distributed applications.
- Experienced in leading on-site/offshore project teams.
- Excellent domain knowledge of Banking and Financial Services, Manufacturing, Insurance, and Food Production.
- Strong expertise in the ETL tools Informatica PowerCenter 8.x/9.x/10.0 (Designer, Workflow Manager, Repository Manager) and Informatica Intelligent Cloud Services (IICS), and in ETL concepts.
- Extensive experience with Data Extraction, Transformation, and Loading (ETL) from disparate data sources such as multiple relational databases (Teradata, Oracle, SQL Server, DB2) and flat files.
- Worked with various transformations such as Normalizer, Expression, Rank, Filter, Aggregator, Lookup, Joiner, Sequence Generator, Sorter, Update Strategy, Source Qualifier, Transaction Control, Java, Union, and CDC.
- Worked with Teradata utilities such as FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter; highly experienced in Teradata SQL programming.
- Experienced in Teradata Parallel Transporter (TPT); used full pushdown optimization (PDO) on Teradata and worked with different Teradata load operators.
- Designed and developed Informatica mappings, including Type 1 and Type 2 Slowly Changing Dimensions (SCD).
- Validated data files against their control files and performed technical data quality checks to certify source file usage.
- In-depth data modeling knowledge of dimensional modeling, Star Schema, Snowflake Schema, and fact and dimension tables.
- Experienced in SQL and PL/SQL programming: stored procedures, packages, functions, triggers, views, and materialized views.
- Experienced in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
- Very strong knowledge of the end-to-end Data Quality process and its implementation.
- Good experience in writing UNIX shell scripts, SQL scripts for development, automation of ETL process, error handling and auditing purposes.
- Coordinated with business users, the functional design team, and the testing team during the different phases of project development and resolved issues.
- Experience in performance tuning of Informatica mappings and sessions to improve performance for the large volume projects.
- Excellent communication and interpersonal skills; self-motivated quick learner, team leader, and team player.
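As an illustration of the UNIX shell scripting used for ETL automation, error handling, and auditing, a minimal sketch is shown below; the audit log path and step names are hypothetical.

```shell
#!/bin/sh
# Minimal sketch of shell-based ETL error handling and auditing.
# AUDIT_LOG path and step names are hypothetical examples.
AUDIT_LOG="${AUDIT_LOG:-/tmp/etl_audit.log}"

run_step() {
    step_name="$1"; shift
    start_ts=$(date '+%Y-%m-%d %H:%M:%S')
    if "$@"; then
        status="SUCCESS"
    else
        status="FAILURE"
    fi
    # Append one audit record per ETL step: timestamp|step|status
    printf '%s|%s|%s\n' "$start_ts" "$step_name" "$status" >> "$AUDIT_LOG"
    [ "$status" = "SUCCESS" ]   # propagate the step result to the caller
}

run_step "load_stage_customers" true
run_step "load_stage_orders" false || echo "load_stage_orders failed; see $AUDIT_LOG"
```

Each ETL step is wrapped so that its outcome is recorded for auditing and its exit status can drive downstream error handling.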
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 10.1/9.x/8.x, Informatica Intelligent Cloud Services (IICS)
Languages: SQL, PL/SQL, HTML, XML, Perl, UNIX Shell Scripting, Python Scripting
Methodology: Agile, SCRUM, Waterfall.
Databases: Oracle 11g/10g, SQL Server 2012/2008, DB2 UDB, Teradata 15/14, AWS Redshift
Database Utilities: AWS S3, Teradata SQL Assistant, FastLoad, MultiLoad, BTEQ, FastExport, TPump, TPT
Operating Systems: Windows 10, UNIX, Linux
Modeling Tools: Erwin 9.1, MS Visio
Scheduling Tools: Control-M, Autosys, Tivoli Workload Scheduler
Reporting: Tableau 9.2, Cognos 10, Power BI, Shiloh
Other Tools: JIRA, Notepad++, MS Office, T-SQL, TOAD, SQL Developer, PL/SQL Developer, XML files, GitHub, Oracle ERP, PuTTY, SharePoint
PROFESSIONAL EXPERIENCE:
Confidential, St. Louis, MO
Sr. Informatica Developer
Responsibilities:
- Provided strong technical expertise in Informatica and was responsible for assigned project tasks.
- Developed in-depth knowledge of Petcare food production item data, POS data, store data, and direct-to-consumer data.
- Initiated escalation procedures for incidents based on agreed-upon timelines and tracked them to closure.
- Managed the ADO queue effectively and allocated tasks to the team based on an allocation plan.
- Demonstrated good communication and customer-handling skills.
- Worked with end clients and product owners; scheduled meetings with the product owner and on-site/offshore team members to gather business requirements and allocated tasks to individuals based on workload.
- Monitored applications and batch jobs during onsite hours and addressed all batch failures.
- Provided permanent solutions for recurring tickets through problem management to stabilize the batch process (P2, P3, P5 tickets).
- Addressed ad hoc tasks, published on-time updates to the customer, handled planned and unplanned outages, and communicated them to all business users on time.
- Performed application enhancement and development work.
- Worked closely with DBA and Data Analyst on Data issues.
- Worked in an agile team; responsible for creating user stories and subtasks on the Kanban story board.
- Worked with Shiloh Technologies to create and deliver Walmart & Sam's Club reports.
Environment: Informatica 10.1, SAP, Oracle, DB2, flat files, SQL, PL/SQL, DB Visualizer, UNIX & Python scripting, FileZilla, Power BI, Retailer BO portal
Confidential, Portsmouth, NH
Sr. Informatica Developer
Responsibilities:
- Effectively analyzed large amounts of data from a data modeling perspective.
- Identified and analyzed source system data for data quality, data profiling, and data governance.
- Responsible for BI/Data Warehouse data modeling with involvement in design, dimensional modeling and architecture.
- Responsible for developing ETL data mapping documents for source to target column mapping and business rules.
- Analyzed and understood data relationships between entities.
- Built re-engineered models from existing source models.
- Responsible for defining architectures that meet security, scalability, high availability, and network management/monitoring requirements, in a hybrid environment.
- Optimization of SQL scripts by Indexing, Partitioning and de-normalization.
- Responsible for developing Unit test case documents.
- Responsible for creating connections to AWS S3 and Redshift through Informatica Intelligent Cloud Services.
- Worked in an agile team; responsible for creating user stories and subtasks.
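The S3-to-Redshift staging pattern behind this work typically relies on Redshift's COPY command; the sketch below only builds and prints the statement, and the bucket, table, and IAM role names are hypothetical.

```shell
#!/bin/sh
# Sketch of generating a Redshift COPY statement for staging S3 data.
# Bucket, table, and IAM role names are hypothetical; the SQL is only
# printed here, not executed against a cluster.
make_copy_sql() {
    table="$1"; s3_path="$2"; iam_role="$3"
    printf "COPY %s FROM '%s' IAM_ROLE '%s' FORMAT AS CSV;\n" \
        "$table" "$s3_path" "$iam_role"
}

make_copy_sql stg_sales "s3://example-bucket/sales/2020/" \
    "arn:aws:iam::123456789012:role/RedshiftCopyRole"
```

In practice the IICS connection handles authentication and the load itself; this only illustrates the underlying COPY pattern.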
Environment: Informatica Intelligent Cloud Services, SAP HANA, AWS S3, AWS Redshift, SQL, PL/SQL, DB Visualizer, Python scripting, JIRA Task board.
Confidential, Detroit, Michigan
Sr. Informatica Developer
Responsibilities:
- Involved in all phases of the SDLC, from requirement gathering, design, development, and testing through production and user support for the production environment.
- Proactively interacted with business users to record user requirements and perform business analysis.
- Outlined the complete process flow and documented the data conversion, integration, and load mechanisms to verify specifications for this data migration project with the help of the IICS tool.
- Parsed high-level design specs into simple ETL coding and mapping standards.
- Worked with PowerCenter Designer tools in developing mappings and Mapplets to extract and load the data from flat files and Oracle database.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Created multiple BTEQ scripts and used Teradata utilities such as BTEQ, FastLoad, MultiLoad, and FastExport.
- Created the design and technical specifications for the ETL process of the project.
- Used Informatica IICS as the ETL tool to create source/target definitions, mappings, and sessions to extract, transform, and load data into staging tables from various sources.
- Responsible for mapping and transforming existing feeds into the new data structures and standards using Router, connected and unconnected Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.
- Worked with various complex mappings; designed Slowly Changing Dimension Type 1 and Type 2.
- Maintained Development, Test and Production Mappings, migration using Repository Manager. Involved in enhancements and Maintenance activities of the data warehouse.
- Performance tuning of the process at the mapping level, session level, source level, and the target level.
- Implemented various new components like increasing the DTM Buffer Size, Database Estimation, Incremental Loading, Incremental aggregation, Validation Techniques, and load efficiency.
- Built exception-handling mappings for data quality, data cleansing, and data validation.
- Created Workflows containing command, email, session, decision and a wide variety of tasks.
- Tuned the mappings based on criteria and created partitions in case of performance issues.
- Tested end to end to verify failures in the mappings using shell scripts.
- Performed data validation after successful end-to-end tests and implemented appropriate error handling in ETL processes.
- Developed parameter files for passing values to the mappings for each type of client.
- Scheduled batches and sessions within Informatica using the Informatica scheduler and wrote shell scripts for job scheduling.
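Per-client parameter files like those mentioned above follow the standard PowerCenter parameter-file layout (a bracketed folder/workflow/session heading followed by $$ parameters). A minimal generator sketch, with hypothetical folder, workflow, and parameter names:

```shell
#!/bin/sh
# Sketch of generating a PowerCenter parameter file per client type.
# Folder, workflow, session, and parameter names are hypothetical.
gen_param_file() {
    client="$1"; out_file="$2"
    # \$\$ emits a literal "$$" (PowerCenter mapping-parameter prefix)
    cat > "$out_file" <<EOF
[SALES_FOLDER.WF:wf_load_sales.ST:s_m_load_sales]
\$\$CLIENT_CODE=$client
\$\$SRC_FILE=/data/inbound/${client}_sales.csv
EOF
}

gen_param_file retail /tmp/param_retail.txt
```

A scheduler wrapper can then generate the file for the current client before starting the workflow, so one mapping serves every client type.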
Environment: Informatica PowerCenter 10.2, IICS, Cognos 10, Linux, SQL, PL/SQL, Oracle 11g, Teradata 15, TOAD, SQL Server 2012, Control-M, Shell Scripting
Confidential, Costa Mesa, CA
Sr. Informatica Developer
Responsibilities:
- Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
- Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
- Involved in extracting the data from the Flat Files and Relational databases into staging area.
- Migrated mappings, sessions, and workflows from Development to Test and then to the UAT environment.
- Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
- Developed Informatica mappings using Aggregator transformations, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Router transformations.
- Created sessions, extracted data from various sources, transformed data according to requirements, and loaded it into the data warehouse.
- Developed several reusable transformations and mapplets that were used in other mappings.
- Prepared Technical Design documents and Test cases.
- Implemented various Performance Tuning techniques.
- Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
- Performed data integration and lead generation from Informatica cloud into Salesforce cloud.
- Created summarized tables, control tables, and staging tables to improve system performance and serve as a source for immediate recovery of the Teradata database.
- Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various business needs of the transformations while loading the data into Data warehouse.
- Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.
- Wrote hundreds of SQL queries for data quality analysis to validate the input data against business rules.
- Responsible for determining bottlenecks and fixing them with performance tuning.
- Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.
- Used the PMCMD command to start, stop, and ping the server from UNIX, and created UNIX shell scripts to automate the process.
- Performed performance tuning at the mapping and session levels.
- Worked with UNIX shell scripts extensively for job execution and automation.
- Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
- Documented Data Mappings/ Transformations as per the business requirement.
- Migrated code from Development to Test and, upon validation, to Pre-Production and Production environments.
- Provided technical assistance to business program users and developed programs for business and technical applications.
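The PMCMD-based job control described above can be sketched as a small wrapper. The service, domain, folder, and credential values are hypothetical, and pmcmd is replaced by echo so the example runs without a PowerCenter server.

```shell
#!/bin/sh
# Sketch of a pmcmd wrapper for starting workflows from UNIX scripts.
# Service, domain, folder, and credential values are hypothetical.
PMCMD="echo pmcmd"   # dry run; point at the real pmcmd binary in production

start_workflow() {
    folder="$1"; workflow="$2"
    # pmcmd startworkflow: integration service, domain, credentials,
    # then the folder and workflow to run
    $PMCMD startworkflow -sv IntSvc -d Domain_dev \
        -u "$INFA_USER" -p "$INFA_PASS" -f "$folder" "$workflow"
}

INFA_USER=etl_user
INFA_PASS=secret
start_workflow SALES wf_daily_load
```

The same wrapper pattern extends to `stopworkflow` and `pingservice`, which is how the start/stop/ping automation above is typically scripted.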
Environment: Informatica PowerCenter 9.6, Informatica PowerExchange 9.6, SQL Server 2008, Shell Scripts, ORACLE 11g, SQL, PL/SQL, UNIX, Toad, SQL Developer, HP Quality Center, Cognos 9, T-SQL, Autosys
Confidential, San Francisco, California
Informatica Developer
Responsibilities:
- Created analyses of source systems and business requirements and identified business rules.
- Responsible for developing, supporting, and maintaining the ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.
- Worked on mappings, mapplets, and workflows to meet business needs and ensured transformations were reusable to avoid duplication.
- Extensively used ETL to extract data from sources (flat files and Netezza) and load it into the target database.
- Documented Mapping and Transformation details, user requirements, implementation plan and schedule.
- Extensively used Control-M for Scheduling and monitoring.
- Involved in building tables, views and Indexes.
- Supported ad hoc querying, quick deployment, and rapid customization, making it even easier for users to make business decisions.
- Designed and developed efficient error-handling methods and implemented them throughout the mappings. Responsible for data quality analysis to determine cleansing requirements.
- Worked with several facets of the Informatica PowerCenter Designer tool: Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, and Transformation Developer. Developed Informatica mappings for better performance.
- Responsible for performance tuning at the mapping, session, source, and target levels for Slowly Changing Dimension Type 1 and Type 2 data loads.
- Configured sessions using Workflow Manager to have multiple partitions on source data and improve performance. Understood the business needs and implemented them in a functional database design.
- Developed many stored procedures in PL/SQL to implement complex business logic.
- Prepared Unit/ Systems Test Plan and the test cases for the developed mappings.
- Wrote documentation describing development, logic, testing, changes, and corrections.
Environment: Informatica PowerCenter 8.6, Oracle 11g, MySQL, AWS Redshift, HDFS, MSSQL Server 2012, Erwin 9.2, PL/SQL, Control-M, Putty, Shell Scripting, WinSCP, Notepad++, JIRA, Tableau 8.2