Senior Informatica Developer Resume
Chicago, IL
PROFESSIONAL SUMMARY:
- 8+ years of Information Technology experience with Informatica, Teradata, Oracle, PL/SQL, data analysis, design, and development of software applications in client-server environments, delivering Business Intelligence and Data Warehousing solutions for decision-support systems.
- Expert at engaging business users for requirements analysis and defining business and functional specifications.
- Developed mappings to extract data from SQL Server, Oracle, flat files, DB2, and mainframes and load it into the data warehouse using PowerCenter and PowerExchange.
- Experience in Erwin database modeling for data warehouse schemas; proficient in dimensional modeling, including star schema, snowflake, and hierarchy modeling.
- Interacted with data management teams to gather business requirements and resolve problems.
- Loaded data from various data sources and legacy systems into Teradata production and development warehouses using BTEQ, FastExport, MultiLoad, FastLoad, and Informatica.
- Experienced in fast-paced Agile development environments, including Test-Driven Development (TDD) and Scrum.
- Experience configuring Informatica Data Director (IDD) to meet data governance objectives across project life cycles and developing MDM hierarchies.
- Extensive experience in using Microsoft BI tools SSIS, SSAS, and SSRS.
- Experience in data warehouse development from inception through implementation and ongoing support; strong understanding of BI application design and development principles.
- Created CDC subscriptions for data migration from the ODS to the local database server.
- Experience in creating and understanding Data models, Data flows and Data dictionaries.
- Good understanding of file transfer protocols such as FTP, SFTP, and AS2.
- Experience with Normalization and De-normalization processes.
- Experience in performance testing ETL mappings.
- Worked with technical and business user teams to validate ETL test cases.
- Experience working on C# and .NET front-end applications.
- Strong technical skills in performance tuning, debugging, and troubleshooting within PowerCenter.
- Experience in writing complex T-SQL queries, Sub-queries, Dynamic SQL queries etc.
- Experience in scripting complex Stored Procedures for better Performance.
- Experience with SQL Server constraints (primary key, foreign key, unique key, and check constraints).
- Experience in applying SQL Server Indexes on database tables for high performance and Query Optimization.
- Technical expertise in unit testing, debugging, and troubleshooting with PowerCenter.
- Experience with Data Migration, Data Formatting and Data Validations.
- Experience with managing users and their authorizations.
- Highly experienced in the whole cycle of DTS/SQL Server Integration Services (SSIS) packages (developing, deploying, scheduling, troubleshooting, and monitoring) for performing data transfers and ETL across different servers.
- Experience with logging and error handling in SSIS using event handlers and custom logging.
- Experience with system and user-defined variables and with package and project configurations for SSIS packages.
- Hands-on experience in data modeling: star schema, snowflake schema, slowly changing dimensions, fact tables, dimension tables, normal forms, OLAP, and OLTP.
- Experience in Informatica PowerCenter Tools - Repository Manager, Designer, Workflow Manager and Workflow Monitor and designing Mappings, Mapplets, Reusable Transformations, Transformations, Tasks, Worklets and Workflows.
- Expertise in creating tables, triggers, macros, views, stored procedures, functions, Packages in Teradata database.
- Good understanding of Data warehousing, Dimension Modeling and RDBMS concepts.
- Extracted source data from Oracle 11i, SQL Server, flat files, and COBOL sources.
- Experience with Oracle 11g and SQL server 2012.
- Extensive experience with Informatica PowerCenter 9.5/9.1/8.6.
- Extensive experience with PowerExchange versions 8.6 and later.
- Experience in writing Complex SQL queries, stored procedures and functions using PL/SQL Programming.
- Experience in database programming in SQL, PL/SQL and well versed in UNIX shell scripting .
- Developed SQL scripts in TOAD and created Oracle objects such as tables, materialized views, views, indexes, sequences, and synonyms.
- Proactively monitored and optimized query and session performance and fine-tuned mappings.
- Experience in Collecting and transforming data from heterogeneous data sources like Transactional Databases, Flat files, MS Excel, XML, etc. into the central data warehouse.
- Strong Quality Assurance, debugging and performance tuning skills in ETL Process and hands on experience in troubleshooting and handling production support jobs.
- Worked in the complete Software Development Life Cycle (SDLC) from requirements gathering and analysis through data modeling, design, testing, debugging, implementation, post-implementation support, and maintenance.
- Experience with Type 1, Type 2, and Type 3 slowly changing dimensions (a Type 2 sketch follows this summary).
- Team player and self-starter with good communication skills and ability to work independently and as part of a team.
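A minimal SQL sketch of the Type 2 slowly-changing-dimension load referenced above. The CUSTOMER_STG and CUSTOMER_DIM tables, their columns, and the sequence are hypothetical names used only for illustration:

    -- Expire the current dimension row when a tracked attribute has changed
    UPDATE customer_dim d
       SET current_flag = 'N',
           effective_end_date = CURRENT_DATE
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.status <> d.status));

    -- Insert a new current version for changed or brand-new customers
    INSERT INTO customer_dim
           (customer_key, customer_id, address, status,
            effective_start_date, effective_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.status,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
      LEFT JOIN customer_dim d
        ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
     WHERE d.customer_id IS NULL;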
TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.5/9.1/8.6.0 (Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, Workflow Monitor), Ab Initio 3
BI Tools: Tableau 7/8, IBM Cognos
Databases: Oracle 11g/10g/9i, DB2 8.x, SQL Server 2000/2005, Teradata, Netezza
Languages: SQL, PL/SQL, Shell Scripting
Operating Systems: UNIX (SOLARIS, AIX), Linux, Windows 95/98/NT/2000/XP and DOS
DB Tools: SQL Plus, SQL Loader, Toad, Power Designer, Erwin, SSIS, SSRS, SSAS
Scheduling Tools: Autosys
Web services: Web services, WSDL, SOAP, REST
Others: MS Office (MS Access, MS Excel, MS PowerPoint, MS Word, MS Visio, COSMOS).
PROFESSIONAL EXPERIENCE:
Confidential, CHICAGO, IL
Senior Informatica Developer
Responsibilities:
- Developed mappings for loading the staging tables from text files using different transformations.
- Implemented process automation for recurring production support operations and advised on ways to streamline, harden, de-risk, and standardize the current software delivery and production support processes, for example by automating serial number (SN) assignments for product distribution channels (PDC) in QA/test environments.
- Performed data cleansing and conversion of mainframe datasets from EBCDIC to ASCII format.
- Loaded the mainframe data using the DataStage FTP stage.
- Migrated data from mainframes to the legacy system (DB2).
- Provided production support to the Corporate Data Warehouse (CDW)
- Used Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ 8.6.1): IDE for data profiling over metadata and IDQ 8.6.1 for data quality measurement.
- Used Erwin for logical and physical database modeling of the warehouse; responsible for creating database schemas based on the logical models.
- Experience with databases such as MySQL, MariaDB, and NoSQL databases for configuring the components.
- Worked closely with all application/development teams that used Control-M scheduling.
- Automated provisioning of new servers with Ansible, ensuring servers adhered to their roles and maintained the desired configuration state.
- Involved in building the ETL architecture for Source to Target mapping to generate the target files.
- Developed Mappings, Mapplets, Transformations and Reusable transformations by using PowerCenter Designer
- Hands-on experience with relational database management systems.
- Developed several mappings in Informatica by using the transformations like Unconnected and Connected lookups, Source Qualifier, Expression, Router, Filter, Aggregator, Joiner, Update Strategy, Union, Sequence Generator, Rank, Sorter, Normalizer, Transaction Control etc.
- Developed Workflows, Worklets and Tasks by using PowerCenter Workflow Designer.
- Optimized Sources, Targets, Mappings, Transformations and Sessions to increase session performance
- Extensively used mapping parameters, variables, parameter files, user defined functions in Informatica
- Used stored procedures to create a standard Time dimension and to drop and create indexes before and after loading data into the targets (see the sketch at the end of this role).
- Responsible for error handling using session logs and reject files in the Workflow Monitor.
- Created indexes on database tables and tuned SQL queries to improve performance; performed unit testing and documented the results.
- Worked on Data Conversion and Data Analysis and Data Warehousing to meet EDW requirements.
- Scheduled and monitored automated weekly jobs under UNIX environment.
- Used UNIX scripts and commands to run sets of Informatica jobs in a loop based on year and month, as well as to run other Informatica workflows from UNIX.
- Used parameters and variables to facilitate smooth transition between the development and production environments.
- Performed unit testing of individual modules and their integration testing.
- Debugged and sorted out the errors and problems encountered in the production environment.
Environment: Informatica PowerCenter 9.5.1 (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Erwin, MDM, Control-M, SQL Server, TOAD, Oracle 11g, Data Integrator, SQL Server 2008, Linux, UNIX, IBM Data Storage, Scope Studio, MS SQL Server 2012.
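A minimal PL/SQL sketch of the pre/post-load index handling mentioned in this role (dropping indexes before a bulk load and recreating them afterwards). The SALES_FACT table and IDX_SALES_FACT_DT index are hypothetical names used only for illustration:

    -- Pre-session stored procedure: drop the index so the bulk load is not slowed down
    CREATE OR REPLACE PROCEDURE drop_load_indexes IS
    BEGIN
      EXECUTE IMMEDIATE 'DROP INDEX idx_sales_fact_dt';
    EXCEPTION
      WHEN OTHERS THEN
        NULL;  -- ignore "index does not exist" on a first run (sketch only)
    END;
    /

    -- Post-session stored procedure: rebuild the index after the load completes
    CREATE OR REPLACE PROCEDURE create_load_indexes IS
    BEGIN
      EXECUTE IMMEDIATE 'CREATE INDEX idx_sales_fact_dt ON sales_fact (sale_date)';
    END;
    /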
Confidential, Chicago
Teradata/ETL Developer
Responsibilities:
- Track and communicate team velocity and sprint/release progress
- Worked with DBA for distribution key, random distribution and Organization Key on tables.
- Created an external table from a file.
- Involved in Performance/Query tuning. Generation/interpretation of explain plans and tuning SQL to improve performance.
- Involved in writing UNIX shell scripts to run and schedule batch jobs.
- Designed the Informatica mappings based on AB Initio code.
- Fixed the existing components by comparing the Informatica code with Ab Initio graph.
- Created ELT jobs and performed performance monitoring and logging.
- Created aggregations in data models, defining the highest-level objects for dimensional modeling.
- Worked in Agile project management environment.
- Incorporated user input during agile development.
- Extensively worked on Connected & Unconnected Lookups, Router, Expressions, Source Qualifier, Aggregator, Filter, Sequence Generator, etc.
- Created data stores, project, jobs, and data flows using Data Integrator
- Created and maintained surrogate keys on the master tables to handle SCD type 2 changes effectively.
- Mentored Informatica developers on project for development, implementation, performance tuning of mappings and code reviews.
- Experience with Informatica B2B Data Exchange using unstructured and structured data sets.
- Used the unstructured data option (PDF files, spreadsheets, Word documents, legacy formats, and print streams) to obtain normalized data with Informatica B2B Data Exchange.
- Used SQL tools like TOAD to run SQL queries and validate the data in warehouse and mart.
- Converted Oracle DDLs to Netezza DDLs (see the sketch at the end of this role).
- Created the format of the unit test documents per the Netezza framework.
- Expert in designing and scheduling complex SSIS packages for manually transferring data from multiple data sources to SQL Server.
- Extensively used Teradata utilities like Fast load , Multiload to load data into target.
- Performed bulk loading of Teradata tables using the TPump utility.
- Validated the target data using SQL Assistant for Teradata .
- Developed Informatica mappings/mapplets, sessions, Workflows for data loads and automated data loads using UNIX shell scripts .
- Used various lookup caches (static, dynamic, persistent, and non-persistent) in Lookup transformations.
- Involved in debugging mappings, recovering sessions and developing error-handling methods.
- Successfully migrated objects to the production environment while providing both technical and functional support.
- Developed E-MAIL tasks to send mails to production support and operations.
- Optimized data transformation processes in the Hadoop and big data environment.
- Involved in unit testing and documentation of the ETL process.
- Extensive Tableau Experience in Enterprise Environment. Experience includes technical support, troubleshooting, report design and monitoring of system usage.
- Extensively used ETL to load data from Flat files, DB2 and Oracle into Teradata.
- Used Informatica Designer to Extract & Transform the data from various source systems by incorporating various business rules. Also used different transformations, sessions and command tasks.
- Created mappings using different transformations like Aggregator, Expression, Stored Procedure, Filter, Joiner, Lookup, Router and Update Strategy.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Used Informatica partitioning to improve data loading performance.
- Developed shell scripts for job automation that generate a log file for every job.
- Created an Informatica framework for audit and error balancing.
- Used IBM Cognos for reporting.
Environment: Informatica PowerCenter 9.x, Netezza, IDQ, SFDC, SSIS, SSAS, Hadoop, Microsoft SQL Server, Aginity Workbench, DB2, Flat Files.
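A minimal illustration of the Oracle-to-Netezza DDL conversion mentioned in this role. The ORDERS table and its columns are hypothetical and used only for illustration:

    -- Oracle source DDL
    CREATE TABLE orders (
      order_id    NUMBER(10)   NOT NULL,
      customer_id NUMBER(10),
      order_date  DATE,
      amount      NUMBER(12,2)
    );

    -- Equivalent Netezza DDL: NUMBER maps to INTEGER/NUMERIC and a distribution key is chosen
    CREATE TABLE orders (
      order_id    INTEGER      NOT NULL,
      customer_id INTEGER,
      order_date  DATE,
      amount      NUMERIC(12,2)
    )
    DISTRIBUTE ON (order_id);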
Confidential, Mather, CA
Sr. Informatica Developer
Responsibilities:
- Analyzed the source data coming from different systems and worked with business users and developers to develop the model.
- Extracted, Transformed and Loaded OLTP data into the Staging area and Data Warehouse using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup (Connected & Unconnected), Source Qualifier, Filter, Update Strategy, Stored Procedure, Router, and Expression).
- Developed a number of complex Informatica mappings, mapplets, and reusable transformations for different types of tests on customer information and for monthly and yearly loading of data.
- Used Workflow Manager for workflow and session management, database connection management, and scheduling of jobs to be run in the batch process.
- Extracted data from various sources like Flat Files, SQL server and Oracle.
- Extensively Used Environment SQL commands in workflows prior to extracting the data in the ETL tool.
- Created Sessions, reusable Worklets and Batches in Workflow Manager and Scheduled the batches and sessions at specified frequency.
- Monitored the sessions using Workflow Monitor.
- Performed data quality analysis to determine cleansing requirements.
- Used stored procedures to create a standard Time dimension, drop and create indexes before and after loading data into the targets.
- Removed bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
- Captured erroneous records, corrected them, and loaded them into the target system.
- Created mappings, mapplets, and transformations that remove duplicate records from the source (a de-duplication sketch follows this role).
- Implemented efficient and effective performance tuning procedures and performed benchmarking; these sessions were used to set a baseline against which to measure improvements.
- Tuned the source and target systems based on performance details; once source and target were optimized, sessions were run again to determine the impact of the changes.
Environment: Informatica Power Center 8.5, MS SQL Server 2008, DB2, Oracle 10g, Teradata 13, Unix Shell Scripts, Toad.
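A minimal SQL sketch equivalent to the source de-duplication referenced in this role (the actual work was done with Informatica transformations); the CUSTOMER_SRC table and its columns are hypothetical and used only for illustration:

    -- Keep one row per business key, preferring the most recently updated record
    SELECT *
      FROM (SELECT s.*,
                   ROW_NUMBER() OVER (PARTITION BY s.customer_id
                                      ORDER BY s.last_updated DESC) AS rn
              FROM customer_src s) t
     WHERE t.rn = 1;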
Confidential, Bridgewater, NJ
Sr. Informatica Lead Developer
Responsibilities:
- Co-ordinated Joint Application Development (JAD) sessions with Business Analysts and source developer for performing data analysis and gathering business requirements.
- Developed technical specifications of the ETL process flow.
- Designed the Source - Target mappings and involved in designing the Selection Criteria document.
- Worked on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Teradata.
- Used Informatica PowerCenter to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems (SQL server, Oracle, Flat files).
- Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator and Normalizer.
- Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata, reducing the development time.
- Implemented an Informatica framework for dynamic parameter file generation; start, failed, and succeeded emails for an integration; error handling; and operational metadata logging.
- Implemented sending of Post-Session Email once data is loaded.
- Worked with DBA for partitioning and creating indexes on tables used in source qualifier queries.
- Involved in performance/query tuning: generation and interpretation of explain plans and tuning SQL to improve performance (see the sketch at the end of this role).
- Scheduled various daily and monthly ETL loads using Autosys.
- Involved in writing UNIX shell scripts to run and schedule batch jobs.
- Involved in unit testing and documentation of the ETL process.
- Involved in Production Support in resolving issues and bugs.
Environment: Informatica PowerCenter 9.x, Teradata, PL/SQL, Oracle 11g, Toad 8.0, UNIX Shell Scripting, Tableau.
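A minimal sketch of the explain-plan and SQL tuning work referenced in this role, against a hypothetical ORDER_FACT table (all names are illustrative only):

    -- Generate and display the Oracle execution plan for a candidate query
    EXPLAIN PLAN FOR
    SELECT customer_id, SUM(amount)
      FROM order_fact
     WHERE order_date >= DATE '2015-01-01'
     GROUP BY customer_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- If the plan shows a full scan on a selective predicate, an index may help
    CREATE INDEX idx_order_fact_dt ON order_fact (order_date);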
Confidential
ETL Developer
Responsibilities:
- Analyzed source systems, business requirements and identified business rules for building the data warehouse.
- Developed Technical Specifications of the ETL process flow
- Designed and developed Informatica mappings, workflows to load data into Oracle ODS.
- Installed and configured Informatica PowerCenter client tools and connected to each database in the data warehouse using the repository server.
- Extensively used Informatica to load data from Oracle, XML and Flat Files to Oracle.
- Used Informatica workflow manager, monitor, and repository manager to execute and monitor workflows and assign user privileges.
- Extensively worked with Aggregator, Sorter, Router, Filter, Joiner, Expression, Lookup, Update Strategy, and Sequence Generator transformations.
- Set up a metadata-driven utility design for the ETL processes using Informatica.
- Used debugger to test the mapping and fixed the bugs.
- Involved in tuning the performance of sessions and mappings.
- Used the Workflow manager to create workflows and tasks, and also created Worklets.
- Involved in Production Support in resolving issues and bugs.
- Worked on SQL stored procedures, functions, and packages in Oracle (a package sketch follows this role).
- Scheduled and executed batch and session jobs on Autosys.
- Created and maintained UNIX shell scripts for pre/post session operations and various day-to-day operations.
- Developed unit and system test cases, using System Procedures to check data consistency with adherence to the data model defined.
Environment: Informatica PowerCenter 9.x, Oracle 8i, PL/SQL, SQL
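A minimal sketch of an Oracle package of the kind referenced in this role, with a hypothetical audit-logging procedure (the package, procedure, and table names are illustrative only):

    CREATE OR REPLACE PACKAGE etl_audit_pkg IS
      PROCEDURE log_load(p_job_name IN VARCHAR2, p_row_count IN NUMBER);
    END etl_audit_pkg;
    /

    CREATE OR REPLACE PACKAGE BODY etl_audit_pkg IS
      PROCEDURE log_load(p_job_name IN VARCHAR2, p_row_count IN NUMBER) IS
      BEGIN
        -- etl_audit_log is a hypothetical audit table
        INSERT INTO etl_audit_log (job_name, row_count, load_date)
        VALUES (p_job_name, p_row_count, SYSDATE);
        COMMIT;
      END log_load;
    END etl_audit_pkg;
    /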
Confidential
ETL Developer
Responsibilities:
- Responsible for design, development, enhancement of Informatica mappings.
- Extensively used Informatica Designer to create and manipulate source and target definitions, mappings, transformations.
- Extensively used Informatica Server Manager to create workflows and batches and to schedule mappings based on user requirements.
- Creation of Transformations like Lookup and Source Qualifier Transformations in the Informatica Designer.
- Created various other transformations such as Aggregator, Expression, Filter, Update Strategy, Stored Procedure, and Router, and fine-tuned the mappings for optimal performance.
- Involved in creation of Stored Procedures and tested in development and staging environment
- Tuning Informatica Mappings and Sessions for optimum performance.
Environment: Informatica PowerCenter 8.6, Oracle, SQL, PL/SQL, and Windows XP