
Sr. Informatica Developer Resume


Fremont, CA

SUMMARY

  • Around 10 years of IT experience in all phases of the Software Development Life Cycle (SDLC), including user interaction, business analysis/modeling, design, development, integration, planning, testing, and documentation for data warehouse applications, ETL processing, and distributed applications.
  • Excellent domain knowledge of Healthcare, Banking/Financial Services, Manufacturing, and Insurance.
  • 8+ years of strong expertise with the Informatica ETL stack: PowerCenter 10.0/9.x/8.x (Designer, Workflow Manager, Repository Manager), Informatica Data Quality (IDQ), and core ETL concepts.
  • Extensive experience with data extraction, transformation, and loading (ETL) from disparate data sources such as relational databases (Teradata, Oracle, SQL Server, DB2), VSAM, and flat files.
  • Experience working with PowerCenter Web Services and real-time Informatica Cloud services.
  • Experience with Informatica Cloud services in real-time/batch mode to perform data extractions between Salesforce and Trust systems.
  • Experience using Informatica in an SAP HANA environment.
  • Experience working with Informatica Data Quality (IDQ).
  • Proficiency in IDQ development covering data profiling, cleansing, parsing, standardization, verification, matching, and data quality exception monitoring and handling.
  • Strong experience in the data quality development process, issue management, and data remediation.
  • Experience in Informatica Power Center with Web Service Sources and Targets.
  • Experience in Hadoop Ecosystem: Cloudera 5.6 (HDFS, Spark, Hive, Impala, etc.)
  • Experience working in cloud environments with AWS products (RDS, S3, SNS, AWS CLI) and Snowflake.
  • Heavily used nested JSON files to load data from AWS S3 buckets into Snowflake (see the sketch after this list).
  • Strong experience with Informatica tools using real-time Change Data Capture and MD5.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ) 10.0/9.x.
  • Worked with various transformations, including Normalizer, Expression, Rank, Filter, Aggregator, Lookup, Joiner, Sequence Generator, Sorter, SQL, Stored Procedure, Update Strategy, Source Qualifier, Transaction Control, Java, Union, and CDC.
  • Worked with Teradata utilities such as FastLoad, MultiLoad, TPump, and Teradata Parallel Transporter; highly experienced in Teradata SQL programming.
  • Experienced in Teradata Parallel Transporter (TPT); used full PDO on Teradata and worked with different Teradata load operators.
  • Designed and developed Informatica mappings, including Type 1 and Type 2 slowly changing dimensions (SCD).
  • Experienced in advanced Informatica concepts such as pushdown optimization (PDO).
  • Validating data files against their control files and performing technical data quality checks to certify source file usage.
  • Strong data modeling knowledge: dimensional data modeling, star schema, snowflake schema, and fact and dimension tables.
  • Experienced in SQL and PL/SQL programming: stored procedures, packages, functions, triggers, views, and materialized views.
  • Experience working with Business Intelligence Development Studio (BIDS), including SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), and SQL Server Reporting Services (SSRS).
  • Experience in Performance Tuning and Debugging of existing ETL processes.
  • Experience working with PowerExchange to process VSAM files.
  • Experience writing UNIX shell scripts for data warehouse jobs, file operations, and data analytics.
  • Coordinated with business users, the functional design team, and the testing team during the different phases of project development and resolved issues.
  • Solid skills in defining standards and methodologies and performing technical design reviews.
  • Excellent communication skills, interpersonal skills, self-motivated, quick learner, team player.
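
As a minimal illustration of the S3-to-Snowflake JSON loading mentioned above, a sketch in Snowflake SQL; the stage, bucket, and table names are hypothetical:

    -- Hypothetical external stage over the S3 bucket (credentials/storage
    -- integration assumed to be configured separately)
    CREATE OR REPLACE STAGE raw_stage
      URL = 's3://example-bucket/events/'
      FILE_FORMAT = (TYPE = 'JSON');

    -- Land each nested JSON document into a single VARIANT column
    CREATE OR REPLACE TABLE raw_events (payload VARIANT);

    COPY INTO raw_events
      FROM @raw_stage
      ON_ERROR = 'SKIP_FILE';  -- skip bad files rather than abort the load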

TECHNICAL SKILLS

ETL Tools: Informatica Cloud, Informatica PowerCenter 10.0/9.x/8.x, Informatica PowerExchange 10.0/9.x/8.x, Informatica Data Quality 8.6

Languages: C, C++, SQL, PL/SQL, HTML, XML, UNIX Shell Scripting

Methodology: Agile, RUP, Scrum, Waterfall

Databases: Oracle 11g/10g, SQL Server 2012/2008, DB2 (UDB), Teradata 15/14/13, Sybase

Operating Systems: Windows NT, 2003, 2007, UNIX, Linux

SQL Server Tools: SSMS, Configuration Manager, Enterprise Manager, Query Analyzer, Profiler, DTS, SSIS, SSAS, SSRS, Database Tuning Advisor, SQL*Plus

Data Warehousing/BI Tools: SQL Server Business Intelligence Development (SSIS, SSAS, SSRS, DTS), PerformancePoint Server, Self-service BI (Power BI, Power Pivot, Power View, Power Map, Power Query)

Modeling Tools: Erwin 9.1/7.2, MS Visio

Scheduling Tools: Control-M, Autosys

Hadoop / Big Data: AWS, S3 Bucket, SNS, Cloudera, HDFS, HBase, Spark, Hive, Impala

Reporting: Tableau 10/9, Cognos 10/9

Others: Salesforce.com, SQL*Loader, MS Office, UltraEdit, Autosys, Control-M, HP Quality Center, Teradata SQL Assistant

Other Tools: JIRA, Notepad++, Teradata Viewpoint, MS Office, T-SQL, SQL Developer, XML Files, JSON, GitHub, Oracle ERP, PuTTY, SharePoint, SVN

PROFESSIONAL EXPERIENCE

Confidential, Fremont, CA

Sr. Informatica Developer

Responsibilities:

  • Analyzed business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow.
  • Worked extensively with mappings using expressions, aggregators, filters, lookup, joiners, update strategy and stored procedure transformations.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Developed mappings to load fact and dimension tables, including Type 1 and Type 2 dimensions and incremental loads, and unit tested the mappings (see the sketch after this list).
  • Used PowerExchange for Hadoop to extract data from HDFS and load data to HDFS/Hive.
  • Coordinate and develop all documents related to ETL design and development.
  • Involved in designing the Data Mart models with ERwin using Star schema methodology.
  • Used Repository Manager to create the repository and user groups, and managed users by setting up privileges and profiles.
  • Used the Debugger to debug mappings and correct them.
  • Performed Database tasks such as creating database objects (tables, views, procedures, functions).
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
  • Optimized mappings and implemented complex business rules by creating reusable transformations and mapplets.
  • Involved in writing BTEQ, MLOAD, and TPUMP scripts to load data into Teradata tables.
  • Optimized source queries to control temp space and added delay intervals per business requirements for performance.
  • Used Informatica Workflow Manager to create and run batches and sessions and to schedule them to run at specified times.
  • Executed sessions and sequential and concurrent batches for proper execution of mappings, and set up email delivery after execution.
  • Prepared functional specifications, system architecture/design documents, implementation strategy, test plans, and test cases.
  • Implemented and documented all the best practices used for the data warehouse.
  • Improved ETL performance through indexing and caching.
  • Created Workflows, tasks, database connections, FTP connections using workflow manager.
  • Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs.
  • Conducted code walkthroughs with team members.
  • Developed stored procedures in PL/SQL and driver scripts in UNIX shell.
  • Created UNIX shell scripts to automate ETL processes.
  • Used UNIX for check-ins and check-outs of workflows and config files into ClearCase.
  • Automated ETL workflows using Control-M Scheduler.
  • Involved in production deployment and later in warranty support until transition to the production support team.
  • Monitored and reported issues for daily, weekly, and monthly processes; resolved issues on a priority basis and reported them to management.
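
A minimal sketch of the Type 2 dimension-load pattern referenced above; the table and column names are hypothetical, and on the project this logic was built as Informatica mappings rather than hand-written SQL:

    -- Step 1: expire the current dimension row when a tracked attribute changed
    UPDATE dim_customer
    SET    eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = dim_customer.customer_id
                   AND    s.address    <> dim_customer.address);

    -- Step 2: insert new versions of changed rows plus brand-new customers
    -- (changed keys no longer have a current row after step 1, so the
    -- anti-join picks them up along with genuinely new keys)
    INSERT INTO dim_customer
           (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    LEFT JOIN dim_customer d
           ON d.customer_id  = s.customer_id
          AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;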

Environment: Informatica PowerCenter 10.2, IDQ 10.2, Oracle 11g, Teradata 15.1.0, Teradata SQL Assistant, MS SQL Server 2012, DB2, Erwin 9.2, DAC Scheduler, Shell Scripting, ClearCase, PuTTY, WinSCP, Notepad++, JIRA, Control-M V8, QlikView Reporting.

Confidential, Houston, TX

Sr. Informatica Developer

Responsibilities:

  • Analyzed business requirements, technical specifications, source repositories, and physical data models for ETL mapping and process flow.
  • Worked with business analysts and end users to correlate business logic and specifications for ETL and Informatica Cloud task development.
  • Designed Informatica Cloud-based integration solutions to extract, transfer, and load data from SQL Server to cloud-based Salesforce.
  • Worked extensively on Source, Target, Lookup, Expression, and Filter transformations in Informatica Cloud.
  • Worked on Informatica Cloud and developed jobs such as data synchronization tasks, data replication tasks, mappings, and task flows.
  • Actively participated in gathering requirement documents, analysis, design, and development using Informatica Cloud and Salesforce.com.
  • Understood and analyzed existing Salesforce Data Loader jobs.
  • Imported Hive tables using PowerExchange.
  • Used PowerExchange for Hadoop to extract data from HDFS and load data to HDFS/Hive.
  • Created Informatica mappings that read data from HDFS and populated AWS S3 in JSON format.
  • Wrote complex SQL queries to load JSON data staged in S3 into the Snowflake database (see the sketch after this list).
  • Applied data validation rules, processed the data according to requirements, and inserted/updated it into the target (Salesforce.com fields).
  • Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the Business requirements.
  • Created custom schedules and task flows in Informatica Cloud for running data synchronization tasks and mappings tasks at regular intervals.
  • Created mapping documents to outline data flow from sources to targets.
  • Developed single dynamic ETL mapping to load more than 30 reference tables.
  • Used SQL queries to validate the data after loading.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Automated/Scheduled the cloud jobs to run daily with email notifications for any failures.
  • Debugged mappings to gain troubleshooting information about data and error conditions.
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
  • Optimized mappings and implemented complex business rules by creating reusable transformations and mapplets.
  • Prepared functional specifications, system architecture/design documents, implementation strategy, test plans, and test cases.
  • Created full-replication and incremental Informatica Cloud mappings and scheduled the jobs to run at specific intervals using Control-M.
  • Scheduled, ran, and monitored sessions using Informatica Workflow Manager.
  • Created UNIX shell scripts to automate ETL processes.
  • Involved in production deployment and later moved into warranty support until transition to production support team.
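
A minimal sketch of the kind of Snowflake query used to shred the nested JSON mentioned above, assuming the documents were landed in a VARIANT column (the raw_events table and the JSON attribute names are hypothetical):

    -- Project nested attributes and explode an embedded array of contacts
    SELECT t.payload:accountId::STRING    AS account_id,
           t.payload:createdAt::TIMESTAMP AS created_at,
           c.value:name::STRING           AS contact_name
    FROM   raw_events t,
           LATERAL FLATTEN(INPUT => t.payload:contacts) c;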

Environment: Informatica Cloud Real Time, Informatica Cloud Secure Agent, Salesforce.com platform, Workflow & Approvals, Custom Objects, Custom Tabs, Email Services, Security Controls, Sandboxes (configure sandbox, test sandbox), Informatica PowerCenter 9.6.1, MS SQL, Control-M, AWS S3 bucket, SQL, PL/SQL, Oracle 11g, TOAD, SQL Server 2012, PuTTY, WinSCP, Tableau 10

Confidential, Durham, NC

Sr. Informatica Developer

Responsibilities:

  • Coordinated with various business users, stakeholders, and SMEs for functional expertise, design and business test scenario reviews, UAT participation, and validation of data from multiple discrete sources.
  • Created Mappings using Mapping Designer to load the data from various sources, using different transformations like Source Qualifier, Expression, Lookup (Connected and Unconnected), Aggregator, Update Strategy, Joiner, Filter, and Sorter transformations.
  • Sourced data from SAP HANA using the SAP adapter into Teradata target tables.
  • Extracted data from a web service source, transformed it using a web service, and loaded it into a web service target.
  • Worked with real-time web services that perform a lookup operation using a key column as input and return multiple rows of data belonging to that key.
  • Used the Web Service Provider Writer to send flat-file targets as attachments and to send email from within a mapping.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Worked on Power Center Designer tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Used Debugger within the Mapping Designer to test the data flow between source and target and to troubleshoot the invalid mappings.
  • Wrote and ran SQL queries in TOAD and SQL Developer to validate the data.
  • Scheduled Informatica Jobs through Autosys scheduling tool.
  • Involved in creating Informatica mappings, mapplets, worklets and workflows to populate the data from different sources to warehouse.
  • Responsible for facilitating load testing and benchmarking the developed product against the set performance standards.
  • Involved in designing the Data Mart models with ERwin using Star schema methodology.
  • Used Teradata utilities (BTEQ, FastLoad, MultiLoad, TPump) to load data into Teradata tables, and wrote BTEQ scripts to load target data (see the sketch after this list).
  • Involved in testing the database using complex SQL scripts and handled the performance issues effectively.
  • Developed mappings to load fact and dimension tables, including Type 1 and Type 2 dimensions and incremental loads, and unit tested the mappings.
  • Created Workflows, tasks, database connections, FTP connections using workflow manager.
  • Involved in Onsite & Offshore coordination to ensure the completeness of Deliverables.
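
A minimal sketch of the kind of Teradata SQL used alongside the BTEQ loads mentioned above (table and column names are hypothetical):

    -- Keep only the latest staged record per key before loading the target
    INSERT INTO tgt_orders (order_id, order_status, load_ts)
    SELECT order_id, order_status, load_ts
    FROM   stg_orders
    QUALIFY ROW_NUMBER() OVER (PARTITION BY order_id
                               ORDER BY load_ts DESC) = 1;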

Environment: Informatica PowerCenter 8.6.1/8.1.1, Cognos 9, SQL Server 2008, IDQ 8.6.1, Oracle 11g, PL/SQL, TOAD, S3 Bucket, Autosys Scheduler, UNIX, Teradata 14.1.0/13, Erwin 7.5, SAP HANA, WinSCP, PuTTY, Shell Scripting, ClearCase

Confidential, Peoria, IL

Informatica Developer

Responsibilities:

  • Involved in a data masking project and developed Informatica mappings to mask production data using the Data Masking transformation and other transformations.
  • Responsible for design and development of Salesforce Data Warehouse migration project leveraging Informatica PowerCenter ETL tool.
  • Designed and developed complex ETL mappings using transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Sorter, Expression, Router, Filter, Aggregator, and Sequence Generator.
  • Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.
  • Performed data integration and lead generation from Informatica cloud into Salesforce cloud.
  • Created summary tables, control tables, and staging tables to improve system performance and serve as a source for immediate recovery of the Teradata database.
  • Designed and executed a data quality audit/assessment.
  • Designed and executed data quality mappings that cleanse, de-duplicate, and otherwise prepare the project data.
  • Implemented data quality processes, including transliteration, parsing, analysis, standardization, and enrichment, in both point-of-entry and batch modes.
  • Used IDQ to design and develop custom objects and rules and reference data tables, and to create/import/export mappings.
  • Per business requirements, performed thorough data profiling with multiple usage patterns, root cause analysis, and data cleansing, and developed scorecards using Informatica data quality tools.
  • Extracted Salesforce CRM information into the BI Data Warehouse using the Force.com API/Informatica on Demand, providing integration with Oracle financial information for advanced reporting and analysis.
  • Created Stored Procedures to transform the Data and worked extensively in T-SQL, PL/SQL for various needs of the transformations while loading the data into Data warehouse.
  • Developed transformation logic as per the requirement, mappings and loaded data into respective targets.
  • Used the pmcmd command to run workflows from the command-line interface.
  • Worked with Informatica Cloud Data Loader for Salesforce to reduce the time taken to import or export critical business information between Salesforce CRM and Force.com.
  • Performed data quality analysis to validate the input data against the cleansing rules (see the sketch after this list).
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.
  • Used the sandbox for testing to ensure minimum code coverage for the application to be migrated.
  • Used pmcmd to start, stop, and ping the server from UNIX, and created UNIX shell scripts to automate the process.
  • Performed performance tuning at the mapping and session levels.
  • Worked with UNIX shell scripts extensively for job execution and automation.
  • Coordinated with Autosys team to run Informatica jobs for loading historical data in production.
  • Documented Data Mappings/ Transformations as per the business requirement.
  • Created XML, Autosys JIL for the developed workflows.
  • Extensively involved in code deployment from Dev to Testing.
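
A minimal sketch of the kind of validation query behind the data quality analysis mentioned above (the staging table, columns, and rules are hypothetical):

    -- Count rows violating basic cleansing rules in the staging table
    SELECT SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END)          AS missing_email,
           SUM(CASE WHEN email NOT LIKE '%@%.%' THEN 1 ELSE 0 END) AS bad_email_format,
           COUNT(*) - COUNT(DISTINCT customer_id)                  AS duplicate_keys
    FROM   stg_customer;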

Environment: Informatica PowerCenter 8.6, Informatica Data Quality (IDQ) 8.6, SQL Server 2008, Oracle 10g, Shell Scripts, Teradata 13, SQL, PL/SQL, UNIX, Toad, SQL Developer, HP Quality Center

Confidential, Winston-Salem, NC

Sr. Informatica Developer

Responsibilities:

  • Involved in full life cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
  • Reviewing the requirements with business, doing regular follow ups and obtaining sign offs.
  • Worked on different tasks in workflows, such as sessions, event raise, event wait, decision, email, command, worklets, assignment, timer, and scheduling of the workflow.
  • Created sessions and configured workflows to extract data from various sources, transform it, and load it into the data warehouse.
  • Moved data from source systems to different schemas based on the dimension and fact tables, using Type 1 and Type 2 slowly changing dimensions (SCD).
  • Used Debugger to test the mappings and fixed the bugs.
  • Used various transformations like Filter, Expression, Sequence Generator, Source Qualifier, Lookup, Router, Rank, Update Strategy, Joiner, Stored Procedure and Union to develop robust mappings in the Informatica Designer.
  • Analyzed sources, requirements, and the existing OLTP system, and identified the required dimensions and facts from the database.
  • Created SSRS reports on a daily, weekly, and monthly basis for managers' review.
  • Tuned Informatica mappings and sessions for optimum performance.
  • Developed various mappings using reusable transformations.
  • Prepared the required application design documents based on functionality required.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed Width and Delimited) to staging database and from staging to the target Warehouse database.
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Responsible for monitoring all sessions that are running, scheduled, completed, or failed, and debugging the mapping when a session failed.
  • Involved in unit and integration testing of Informatica sessions and batches, and fixed invalid mappings.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Developed and executed scripts to schedule loads, for calling Informatica workflows using PMCMD command.
  • Worked on Dimensional Data Modeling using Data modeling tool Erwin.
  • Populated Data Marts and did System Testing of the Application.
  • Built the Informatica workflows to load table as part of data load.
  • Wrote queries, procedures, and functions used across different application modules (see the sketch after this list).
  • Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
  • Created Informatica Technical and mapping specification documents according to Business standards.
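
A minimal PL/SQL sketch of the kind of procedure mentioned above (the audit table and procedure names are hypothetical):

    -- Record one audit row per Informatica workflow run
    CREATE OR REPLACE PROCEDURE log_load_audit (
        p_workflow  IN VARCHAR2,
        p_row_count IN NUMBER,
        p_status    IN VARCHAR2
    ) AS
    BEGIN
        INSERT INTO etl_load_audit (workflow_name, row_count, status, load_ts)
        VALUES (p_workflow, p_row_count, p_status, SYSDATE);
        COMMIT;
    END log_load_audit;
    /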

Environment: Informatica Power Center 8.1, IDQ 8.1, Oracle 10g, Toad, SQL Developer, UNIX
