Lead Informatica Architect / Developer Resume
Jersey City, New Jersey
SUMMARY
- 9.5+ years of professional experience in Information Technology, with extensive experience in Oracle SQL and PL/SQL and in supporting database development, data warehousing, and data integration solutions.
- 7.5+ years of experience in ETL (Informatica PowerCenter) and Teradata across the full development life cycle: analysis, design, development, testing, and implementation.
- 6+ years of experience leading teams and providing the necessary guidance and training.
- Coordinated with the business to understand requirements and designed/implemented ETL processes, including physical and logical data modeling using star schema/snowflake modeling, fact and dimension tables, data marts, multidimensional modeling, and denormalization techniques.
- Experience in IDQ (Informatica Data Quality) for data cleansing, analysis, profiling, and scorecards.
- Certified Data Quality 9.x Developer Specialist.
- Experienced in working with tools such as TOAD, Teradata SQL Assistant, and SQL*Plus for development and customization.
- Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor) and Informatica Developer (IDQ).
- Connected the ETL tool to various databases such as Oracle, Teradata, and Netezza, and to applications such as Salesforce CRM.
- Built Informatica jobs to extract data from Salesforce CRM and load it into the ODS database via IDL and delta processes, and to purge and archive historical Salesforce data.
- Vast experience designing and developing complex mappings using transformations such as unconnected and connected Lookup, Source Qualifier, Router, Filter, Expression, Aggregator, Update Strategy, Stored Procedure, Joiner, Rank, Normalizer, Sorter, External Procedure, and Sequence Generator.
- Expertise in designing and implementing the various Slowly Changing Dimension (SCD) types.
- Expertise in business model development with dimensions, hierarchies, measures, partitioning, time series, cache management, and email configuration.
- Strong experience in writing Oracle stored procedures, functions, packages, triggers, sequences, and synonyms.
- Experienced in creating dynamic SQL using EXECUTE IMMEDIATE and REF CURSORs (native dynamic SQL), and in using PL/SQL collections extensively for high-performance stored procedures (see the sketch after this list).
- Informatica administration activities such as configuring and managing users, groups, roles, deployment groups, services, and connections, and setting permissions at the user and group level.
- Installed Informatica patches, applied hotfixes, and performed repository backups; purged and truncated logs and collected statistics on repository tables to improve repository performance.
- Expertise in administration tasks including importing/exporting mappings, copying folders across the DEV/QA/PRD environments, and managing users, groups, and associated privileges.
- Upgraded Informatica PowerCenter and IDQ from 9.6.1 HF3 to 9.6.1 HF4.
- Extracted Salesforce CRM data using Informatica Cloud and Informatica PowerCenter and loaded it into an Oracle database; implemented CDC logic for the Salesforce-to-Oracle extraction.
- Created and managed data synchronization tasks, mappings, task flows, connections, and schedules in Informatica Cloud.
- Performed admin activities in Informatica Cloud, such as creating and managing users and user groups and configuring the Secure Agent on Windows Server.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
- Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning mappings and sessions.
- Experienced in requirements gathering, technical design of ETL interfaces, and reviewing team members' mapping designs, as well as leading the team through coding, testing, SIT, UAT, and performance tuning of Informatica mappings and workflows, production migration, and stabilization support.
- Involved in performance tuning of data warehouse applications, including the creation of materialized views, bitmap indexes, query analysis, and partitions.
- Extensive experience with the Teradata RDBMS using the FastLoad, MultiLoad, TPump, FastExport, and BTEQ utilities, Teradata SQL, and stored procedures.
- Effectively used various Oracle features such as autonomous transactions, exception handling, and analytic functions.
- Experienced with other databases such as Netezza and MS SQL Server (2000/2005), including T-SQL and stored procedures.
- Designed and implemented KPIs (key performance indicators) for periodic system health checks covering Informatica server performance, ETL load balance, data quality checks, and data analysis.
- Wrote Unix shell, Perl, and DOS batch scripts to automate file handling, including exception handling, email notification, and file conversion to/from MS Excel.
- Automated inbound and outbound file transfers to/from different vendor systems using FTP, SFTP, and LFTP.
- Automated periodic maintenance, checking for expired database passwords, changing them, and verifying database connectivity, using UNIX shell scripts.
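A minimal PL/SQL sketch of the native dynamic SQL and collection pattern referenced above; the table and column names (stg_employees, employees, dept_id) are illustrative assumptions, not from any actual project:

```sql
-- Hypothetical illustration: EXECUTE IMMEDIATE for dynamic DDL, a REF CURSOR
-- opened over a dynamically built query, and a PL/SQL collection bulk-fetched
-- for performance. All object names are assumptions.
DECLARE
  TYPE t_name_tab IS TABLE OF VARCHAR2(100);   -- PL/SQL collection type
  v_names  t_name_tab;
  v_cur    SYS_REFCURSOR;                      -- weakly typed REF CURSOR
  v_sql    VARCHAR2(400);
BEGIN
  -- Native dynamic SQL for DDL that static PL/SQL cannot issue directly
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_employees';

  -- Build the query at run time and bind the filter value
  v_sql := 'SELECT emp_name FROM employees WHERE dept_id = :1';
  OPEN v_cur FOR v_sql USING 10;

  -- Bulk-fetch into the collection instead of row-by-row processing
  FETCH v_cur BULK COLLECT INTO v_names;
  CLOSE v_cur;

  DBMS_OUTPUT.PUT_LINE('Fetched ' || v_names.COUNT || ' rows');
END;
/
```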
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 8.x, 9.x, 10.x, IDQ 9.x, Informatica Cloud, Informatica Administration Console, Informatica Analyst, Informatica Developer.
Databases: Oracle 8i/9i/10g/11g, Teradata 12 & 13, MS SQL Server 2000 & 2005, Netezza
Teradata Utilities: BTEQ, MultiLoad, FastLoad, FastExport, TPump, TPT, Teradata SQL Assistant
Scripting: Unix Shell Scripts, Perl Scripts, DOS Batch Scripts.
Tools: Force.com (Beta), Toad, SQL Developer, MS Visio, PowerDesigner.
Operating Systems: UNIX, Windows XP, Windows 2000 Server, Windows 7 & 8.
Methodologies: Data Warehousing Design, Data Modeling, Logical and Physical Database Design.
CM Tools: Serena Dimensions (PVCS), StarTeam, VSS
Job Scheduler: CronTab 7, Maestro jobs, Autosys R11.0, Informatica Scheduling.
PROFESSIONAL EXPERIENCE
Confidential, Jersey City, New Jersey
Lead Informatica Architect / Developer
Responsibilities:
- Configured sessions to send email on failure or success, including separate email tasks for success and failure notifications.
- Used Secure Shell (SSH), a cryptographic network protocol for initiating text-based shell sessions on remote machines securely, to run commands at a remote machine's command prompt.
- Used Cygwin, an emulation layer that allows programs written for UNIX, GNU, or Linux (POSIX) environments to run on MS Windows platforms.
- Created RulePoint complex event processing rules providing proactive monitoring and operational intelligence, delivering real-time alerts and insight based on conditions defined by business users.
- Implemented complicated business logic using Informatica's Stored Procedure transformation.
- Established connections and refreshed data from one database to another using Oracle materialized views (see the sketch after this list).
- Involved in full life cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
- Prepared the required application design documents based on the required functionality.
- Designed the ETL processes using Informatica to load data from Oracle, flat files (fixed width), and Excel files to a staging database, and from staging to the target.
- Wrote the data validation algorithm for the ETL (extract, transform, and load) team.
- Implemented various transformation logic using Source Qualifier, Expression, Filter, Joiner, Lookup, Router, Update Strategy, and Rank transformations.
- Worked mostly on dimensional data modeling: star schema/snowflake modeling, fact and dimension tables, and physical and logical data models.
- Moved data from source systems to different schemas, based on the dimension and fact tables, using the slowly changing dimension types.
- Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
- Implemented parallelism in loads by partitioning workflows using pipeline, round-robin, hash, key-range, and pass-through partitions.
- Migrated repository objects, services, and scripts from the development environment to the QA environment; extensive experience troubleshooting and solving migration and production issues.
- Wrote queries, procedures, and functions used as part of different application modules.
- Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
- Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
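A hedged sketch of the materialized view pattern mentioned above: replicating a remote table into the local database over a database link with a scheduled refresh. The link name, credentials, and table names are assumptions for illustration only:

```sql
-- Hypothetical: pull data from a source DB via a database link and keep a
-- local copy refreshed hourly through a materialized view.
CREATE DATABASE LINK src_db
  CONNECT TO app_user IDENTIFIED BY "app_password"
  USING 'SRC_TNS_ALIAS';

-- Fast refresh requires a materialized view log on the source table:
--   CREATE MATERIALIZED VIEW LOG ON customers;
CREATE MATERIALIZED VIEW mv_customers
  BUILD IMMEDIATE
  REFRESH FAST
  START WITH SYSDATE NEXT SYSDATE + 1/24    -- schedule: refresh every hour
AS
  SELECT customer_id, customer_name, last_updated
  FROM   customers@src_db;

-- On-demand refresh, e.g. called from an ETL pre-load step
BEGIN
  DBMS_MVIEW.REFRESH('MV_CUSTOMERS');
END;
/
```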
Environment: Windows XP/NT, Informatica PowerCenter 9.6.1, UNIX, Oracle 11g, SQL, PL/SQL, SQL Developer, MS Visio, Unix shell, CronTab 7.
Confidential, Marlborough, MA
Informatica Lead
Responsibilities:
- Created and maintained reusable Informatica objects such as transformations, mapplets, and worklets.
- Designed and implemented ETL loads with SCD Type 1 and Type 2 logic using Informatica PowerCenter transformations (see the SCD Type 2 sketch after this list).
- Designed and implemented KPIs (key performance indicators) for periodic system health checks covering Informatica server performance, ETL load balance, data quality checks, and data analysis.
- Built Informatica jobs to extract data from Salesforce CRM and load it into the ODS database via IDL and delta processes, and to purge and archive historical Salesforce CRM data.
- Wrote Unix shell, Perl, and DOS batch scripts to automate file handling, including exception handling, email notification, and file conversion to/from MS Excel.
- Automated inbound and outbound file transfers to/from different vendor systems using FTP, SFTP, and LFTP.
- Automated periodic maintenance, checking for expired database passwords, changing them, and verifying database connectivity, using UNIX shell scripts.
- Involved in database performance tuning as well as Informatica performance tuning.
- Created stored procedures and functions to automate gathering table statistics, creating and dropping indexes, and enabling and disabling constraints (a sketch of this automation also follows this list).
- Used Informatica repository queries to analyze Informatica objects and their activity.
- Tuned existing complex Oracle queries using indexes, hints, explain plans, and statistics gathering.
- Effectively designed the database using stored procedures, functions, packages, triggers, materialized views, indexes, table partitioning, external tables, and constraints.
- Informatica administration activities such as configuring and managing users, groups, roles, deployment groups, services, and connections, and setting permissions at the user and group level.
- Installed Informatica patches, applied hotfixes, and performed repository backups; purged and truncated logs and collected statistics on repository tables to improve repository performance.
- Expertise in administration tasks including importing/exporting mappings, copying folders across the DEV/QA/PRD environments, and managing users, groups, and associated privileges.
- Upgraded Informatica PowerCenter and IDQ from 9.6.1 HF3 to 9.6.1 HF4.
- Migrated UNIX, database, and Informatica code to the higher environments.
- Designed and implemented data cleansing using Labeler and Standardizer transformations and reference tables in IDQ.
- Migrated IDQ objects into PowerCenter.
- Used the Address Validator and Exception transformations to implement complex business logic in IDQ (Informatica Data Quality).
- Profiled source data using Informatica Analyst and IDQ to provide better ETL solutions.
- Extracted Salesforce CRM data using Informatica Cloud and Informatica PowerCenter and loaded it into an Oracle database; implemented CDC logic for the Salesforce-to-Oracle extraction.
- Prepared and executed unit test, SIT, and UAT cases.
- Created and worked tickets such as general requests, change requests (SCRs), and incidents, meeting their SLAs.
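A SQL sketch of the SCD Type 2 logic referenced above. In the project this was built with PowerCenter Lookup/Update Strategy transformations; the SQL below only shows the equivalent set-based logic, and all table, column, and sequence names are assumptions:

```sql
-- Hypothetical SCD Type 2: expire the current row when a tracked attribute
-- changes, then insert a new current version.
-- Step 1: close out changed dimension rows.
UPDATE dim_customer d
SET    d.eff_end_date = SYSDATE,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (
         SELECT 1 FROM stg_customer s
         WHERE  s.customer_id = d.customer_id
         AND    (s.address <> d.address OR s.segment <> d.segment));

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer
  (customer_key, customer_id, address, segment,
   eff_start_date, eff_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.segment,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (
         SELECT 1 FROM dim_customer d
         WHERE  d.customer_id = s.customer_id
         AND    d.current_flag = 'Y');
```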
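And a hedged PL/SQL sketch of the maintenance automation (statistics gathering and constraint toggling around a bulk load). The procedure name, parameters, and usage are assumptions; the constraint names are read from the Oracle data dictionary:

```sql
-- Hypothetical: disable FK constraints before a bulk load, re-enable them
-- afterwards, and refresh optimizer statistics.
CREATE OR REPLACE PROCEDURE set_fk_constraints (
  p_table  IN VARCHAR2,
  p_action IN VARCHAR2)   -- 'ENABLE' or 'DISABLE'
AS
BEGIN
  FOR c IN (SELECT constraint_name
            FROM   user_constraints
            WHERE  table_name = UPPER(p_table)
            AND    constraint_type = 'R') LOOP
    EXECUTE IMMEDIATE 'ALTER TABLE ' || p_table || ' ' ||
                      p_action || ' CONSTRAINT ' || c.constraint_name;
  END LOOP;

  -- After re-enabling, refresh statistics so the optimizer sees the new data
  IF UPPER(p_action) = 'ENABLE' THEN
    DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => UPPER(p_table));
  END IF;
END set_fk_constraints;
/

-- Usage around a bulk load (table name is an assumption):
--   EXEC set_fk_constraints('SALES_FACT', 'DISABLE');
--   ... run the Informatica load ...
--   EXEC set_fk_constraints('SALES_FACT', 'ENABLE');
```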
Environment: Informatica PowerCenter 9.6.1/10.1, Informatica Developer 9.6.1/10.1, Informatica Cloud, Informatica Administration Console, Informatica Analyst, Oracle 11g/12c, Toad, SQL Developer, Salesforce CRM, flat files, UNIX, Windows 7.
Confidential, Demopolis, Alabama
Lead Informatica Developer
Responsibilities:
- Team lead responsible for assigning tasks to team members, creating metrics reports and weekly status reports, and preparing the project plan, quality-related documents, and migration documents; used SharePoint for document maintenance and tracking.
- Involved in full life cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
- Prepared the required application design documents based on the required functionality.
- Designed the ETL processes using Informatica to load data from Oracle, flat files (fixed width), and Excel files to a staging database, and from staging to the target Teradata warehouse database.
- Wrote the data validation algorithm for the ETL (extract, transform, and load) team.
- Built BTEQ scripts to validate the Edge Model and to generate surrogate keys.
- Implemented various transformation logic using Source Qualifier, Expression, Filter, Joiner, Lookup, Router, Update Strategy, and Rank transformations.
- Worked mostly on dimensional data modeling: star schema/snowflake modeling, fact and dimension tables, and physical and logical data models.
- Implemented an effective Change Data Capture (CDC) method to capture changes from the source and apply them to the target.
- Moved data from source systems to different schemas, based on the dimension and fact tables, using the slowly changing dimension types.
- Developed and executed UNIX wrapper shell scripts to schedule loads and call Informatica workflows via the pmcmd command.
- Created and managed the different PowerExchange directories, such as the condense files directory and the checkpoint directory.
- Used debugger to debug mappings to gain troubleshooting information about data and error conditions.
- Implemented parallelism in loads by partitioning workflows using pipeline, round-robin, hash, key-range, and pass-through partitions.
- Implemented a daily and weekly audit process for each subject area to ensure the data warehouse matches the source systems for critical reporting metrics.
- Migrated repository objects, services, and scripts from the development environment to the QA environment; extensive experience troubleshooting and solving migration and production issues.
- Worked with different source and target types, including Oracle, Teradata, DB2, flat files, XML files, and EBCDIC files.
- Automated test scenarios using BTEQ scripts to validate source system data against the generated data mart views (see the BTEQ sketch after this list).
- Wrote queries, procedures, and functions used as part of different application modules.
- Optimized Teradata SQL queries for better performance.
- Extracted and loaded data using different Teradata tools such as MultiLoad, FastExport, FastLoad, OleLoad, and BTEQ.
- Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
- Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
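A hedged BTEQ sketch of the automated validation pattern referenced above: compare a source row count with the data mart view and return a non-zero code on mismatch so the scheduler flags the job. The logon file path and all database/object names are assumptions:

```sql
-- Hypothetical BTEQ validation script.
.RUN FILE = /home/etl/tdlogon.btq;   -- holds the .LOGON statement

-- Return a row only when the two counts disagree
SELECT 'COUNT MISMATCH'
FROM  (SELECT COUNT(*) AS cnt FROM stg_db.orders)    AS src
CROSS JOIN
      (SELECT COUNT(*) AS cnt FROM mart_db.v_orders) AS tgt
WHERE src.cnt <> tgt.cnt;

-- Any returned row means validation failed: exit with return code 8
.IF ACTIVITYCOUNT > 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```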
Environment: Windows XP/NT, Informatica PowerCenter 9.1/8.6, UNIX, Teradata 13.10, DB2, Oracle 11g, SQL, PL/SQL, Toad, MS Visio, Unix shell, Quality Center 10, PowerExchange 8.6.1, Serena Dimensions (PVCS), StarTeam, Maestro jobs
Confidential, Akron, Ohio
Senior Database Developer
Responsibilities:
- Extensive work experience in SQL and PL/SQL, writing stored procedures, functions, packages, and triggers.
- Experienced in ETL development and data migration using native Oracle tools (SQL*Loader, Oracle external tables, and PL/SQL).
- Exported data using expdp and imported data using impdp (Oracle Data Pump).
- Familiarity with Oracle data warehousing features such as materialized views, bitmap indexes, index-organized tables, and external tables.
- Experienced in UNIX shell programming
- Experienced in Database Optimization by making use of various Performance Tuning techniques.
- Proficient in database and SQL tuning using HINTS and EXPLAIN PLAN.
- Very good knowledge of and experience with Oracle built-in packages such as UTL_FILE and DBMS_SQL.
- Very good working knowledge in TOAD.
- Experienced in Dynamic SQL, PL/SQL Collections and Exception handling.
- Created stored procedures using EXECUTE IMMEDIATE and REF CURSORs (native dynamic SQL).
- Used PL/SQL collections extensively for high-performance stored procedures.
- Worked with autonomous transactions in triggers and functions to implement logging (see the logging sketch after this list).
- Modified packages, procedures, functions, and triggers for the user interface design.
- Designed and generated database objects and Oracle Forms, Reports, and Libraries through Oracle Designer.
- Troubleshot performance issues and bugs within packages, forms, and reports using DBMS_OUTPUT and the Forms debugger.
- Involved in creating Functional and Program Specification documents.
- Led the design, development, and implementation of applications, interfaces, and other components written in PL/SQL and UNIX shell.
- Built staging tables and packages to load history files using SQL*Loader and external tables.
- Involved in the creation of Partitioned Tables and Indexes.
- Used exception handling extensively for ease of debugging and for displaying error messages in the application.
- Used inner joins, outer joins, and cross joins while writing complex SQL queries.
- Worked on sequences, functions, synonyms, indexes, triggers, packages, stored procedures and granting privileges on OLTP server.
- Used FTP to transfer the files into different servers as needed by the business users.
- Involved in creating tables, partitioned tables, join conditions, correlated subqueries, nested queries, and views for business application development.
- Worked on bulk collections (BULK COLLECT and FORALL) for bulk load processing (see the bulk-processing sketch after this list).
- Good knowledge of logical and physical data modeling using normalization techniques.
- Experience with Oracle-supplied packages, dynamic SQL, records, PL/SQL tables, and exception handling.
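A minimal PL/SQL sketch of the autonomous-transaction logging pattern referenced above: the log row commits even if the caller later rolls back. The log table and its columns are assumptions:

```sql
-- Hypothetical logging procedure; PRAGMA AUTONOMOUS_TRANSACTION gives it
-- its own transaction, independent of the caller's.
CREATE OR REPLACE PROCEDURE log_message (p_msg IN VARCHAR2) AS
  PRAGMA AUTONOMOUS_TRANSACTION;
BEGIN
  INSERT INTO app_error_log (log_ts, message)
  VALUES (SYSTIMESTAMP, p_msg);
  COMMIT;                          -- commits only the log insert
END log_message;
/

-- Usage: the log row survives the caller's rollback
BEGIN
  UPDATE accounts SET balance = balance - 100 WHERE acct_id = 1;
  log_message('debited account 1');
  ROLLBACK;    -- undoes the update, but the log row remains
END;
/
```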
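And a sketch of the bulk-collection pattern: fetch in batches with BULK COLLECT and write with FORALL to cut context switches between SQL and PL/SQL. The source/target tables are assumptions (tgt_orders is assumed to have the same column layout as src_orders):

```sql
-- Hypothetical batched load using BULK COLLECT ... LIMIT and FORALL.
DECLARE
  CURSOR c_src IS SELECT * FROM src_orders;
  TYPE t_rows IS TABLE OF c_src%ROWTYPE;
  v_rows t_rows;
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO v_rows LIMIT 1000;  -- batch of 1000 rows
    EXIT WHEN v_rows.COUNT = 0;

    FORALL i IN 1 .. v_rows.COUNT                     -- single bulk DML call
      INSERT INTO tgt_orders VALUES v_rows(i);
  END LOOP;
  CLOSE c_src;
  COMMIT;
END;
/
```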
Environment: Oracle 9i, SQL, PL/SQL, UNIX, Windows, TOAD, Oracle Forms and Reports, Serena Dimensions (PVCS).
Confidential, Cincinnati, Ohio
T-SQL Developer
Responsibilities:
- Wrote SQL and T-SQL procedures according to business needs (see the sketch after this list).
- Transferred data from source to destination using DTS or SSIS packages.
- Created DTS or SSIS packages integrating stored procedures and functions.
- Effectively used SSIS tasks such as FTP upload/download and mail.
- Involved in analysis, design, and development; also responsible for preparing LLD documents, UML models, RTMs, checklists, and UTC documents.
- Reviewed low-level design documents.
- Coordinated frequently with the onsite team and client to better understand the requirements.
- Expertise in writing T-SQL queries, dynamic queries, subqueries, and complex joins for generating complex stored procedures, triggers, user-defined functions, views, and cursors.
- Created SSIS packages to extract data from OLTP to OLAP systems with various transformations, and scheduled jobs to run the packages daily.
- Supported the team in resolving SQL Reporting Services and T-SQL issues; proficient in creating and formatting different report types such as cross-tab, conditional, drill-down, top-N, summary, form, OLAP, and subreports.
- Experience importing/exporting data between different sources such as Oracle, Access, and Excel using the SSIS/DTS utilities.
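A hedged T-SQL sketch of the kind of parameterized reporting procedure described above; the procedure, table, and column names are assumptions for illustration:

```sql
-- Hypothetical reporting procedure: top-N customers by sales for a region.
-- Note: TOP (@variable) requires SQL Server 2005 or later.
CREATE PROCEDURE dbo.usp_GetTopCustomers
    @Region  VARCHAR(50),
    @TopN    INT = 10
AS
BEGIN
    SET NOCOUNT ON;

    SELECT TOP (@TopN)
           c.CustomerID,
           c.CustomerName,
           SUM(o.OrderAmount) AS TotalSales
    FROM   dbo.Customers c
           JOIN dbo.Orders o ON o.CustomerID = c.CustomerID
    WHERE  c.Region = @Region
    GROUP BY c.CustomerID, c.CustomerName
    ORDER BY TotalSales DESC;
END;
GO

-- Usage
EXEC dbo.usp_GetTopCustomers @Region = 'Northeast', @TopN = 5;
```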
Environment: .NET, MS SQL Server 2000 and 2005, SSIS, SSRS, SSAS, VSS.