
Sr. ETL Informatica Power Center / Cloud Integration Developer Resume

SUMMARY

  • 10+ years of strong IT development experience as a Lead Informatica Developer with the Informatica suite of products and a Teradata/Oracle skill set, with good client-facing skills and strong written/oral communication skills.
  • Proven experience in developing enterprise data warehouse solutions: designing business/ETL process flows, documenting data dictionaries, and defining data management tasks, deliverables, strategies, standards, and industry best practices.
  • Over 8 years of information technology experience in ETL and data warehouse technologies using Informatica Power Center, Informatica Data Quality, Big Data Management, and Informatica Power Exchange 8.6.1/5.2 with Teradata, Netezza, and Oracle Warehouse Builder.
  • Designed and developed mappings using Source Qualifier, Expression, connected and unconnected Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations.
  • Worked exclusively on all new transformations in Informatica Data Quality (IDQ) and Master Data Management (MDM), such as Address Validator, Association, Case Converter, Comparison, Consolidation, Key Generator, Labeler, Match, Merge, Parser, SQL, Standardizer, Web Service Consumer, and Weighted Average transformations.
  • Experience with Hadoop ecosystem components: Hive, Pig, Sqoop, Impala, HBase, Flume, Kafka, Storm, Oozie, and ZooKeeper. Experience in writing HiveQL queries and Pig Latin scripts. Created UDFs to perform data transformations as per requirements.
  • Expertise in Data Warehousing with variety of Data manipulating and Data Governance skills like Data Migration, Data Modeling, Data Profiling, Data Cleansing and Data Validation.
  • Strong experience in T-SQL (DDL, DML) documentation, coding, programming and implementation of Business Intelligence solutions using SQL Server 2000/2005/2008 R2.
  • Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star schema and Snowflake schema used in relational, dimensional and multidimensional modeling.
  • Experience in Data Modeling using Erwin Data Modeler (Erwin 4.0/3.5.5) & ER Studio tool for both Physical & logical data modeling.
  • Proficient in using Informatica Data Explorer (IDE), Informatica Developer, Data Analyst, and Data Validation Option (DVO).
  • Used Informatica Data Analyst to analyze, cleanse, standardize, profile, and score data; performed column and rule profiling, scorecarding, reference data management, and bad and duplicate record management.
  • Experience with all Informatica components: Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Experience working in Agile methodology. Conducted Scrum meetings, daily stand-ups, reviews, and retrospectives; responsible for keeping sprints on track.
  • Experience in integrating heterogeneous data sources (Oracle, Teradata, SQL Server 2005/2008, Sybase, DB2, COBOL copybook mainframe files, VSAM files, XML files, flat files, and MS Access) as well as homogeneous sources into the staging area.
  • Experienced with Informatica Power Exchange (9.6.1/8.6), using the Power Exchange Listener and Power Exchange Navigator to create data maps and load/retrieve data from mainframe systems and other enterprise-level data for a cost-effective bulk-loading mechanism.
  • As an Informatica Master Data Management (MDM) analyst, worked with business users to identify, scope, analyze, and finalize requirements for MDM Hub implementation.
  • Tasks involved all phases of the MDM Hub implementation process, including requirements analysis, system design, programming, unit testing, SIT (systems integration testing), roll-out, maintenance, and operational support.
  • Experience working in Oracle 8i/9i/10g/11g with database objects such as triggers, stored procedures, functions, packages, views, and indexes.
  • Expertise in working with Teradata V2R5/V2R6/R13 systems, using utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, and SQL Assistant.
  • Worked with global temporary tables and volatile tables in Teradata, and created secondary indexes when joining two tables for fast data retrieval, improving performance and thereby efficiency.
  • Worked with Oracle Data Integrator (ODI) in the staging/integration layer, along with pre-built modules for CDC, bulk loading, etc.
  • Extensive experience in performance tuning, identified and fixed bottlenecks and tuned the complex Informatica mappings for better performance.
  • Scheduled jobs and automated the workflows using the Autosys Job Scheduler, Control-M Job Scheduler and Informatica Scheduler.
  • Maintain accurate data in project governance tools (HP Quality Center, JIRA, Clear Case & Clear Quest) including project plan estimates, Business Requirements, Design document, Unit/Integration/System Test Plan & Results, Deployment Plan & Schedule, User Response, Affirmation from User to close the Project & Lessons Learned.
  • Used T-SQL for creating Triggers and stored Procedures using SQL Server 2005/08 to create reports.
  • Experience in using Crystal Reports, Business Objects XI, SSRS and SAP BO in design and development of Business Intelligence reports.
  • Created complex SQL queries and used in creation of views and functions for generating complex reports.

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9.6/8.6, Informatica IDQ 9.5, Informatica MDM 9.6, Informatica BigData Edition(BDE/BDM), Data Analyst, Power Exchange 9.6/8.6, Informatica B2B DT Studio and Power Mart, OWB and ODI.

Languages: SQL, PL/SQL, T-SQL, C, C++, UNIX Shell Scripting, Perl Scripting, HTML, Java Scripting, XML.

Databases: Oracle 8i/9i/10g/11g, Teradata V2R5/V2R6/R13, MS SQL Server 2000/2005/2008, Sybase, Flat Files, XML Files, IBM Mainframe Systems, SFDC (salesforce.com), AS/400.

Data Modeling, Data Architecture: Erwin 3.x/4.x, ER Studio, Logical Modeling, Physical Modeling, Relational Modeling, ER Diagrams, Dimensional Data Modeling, Star Schema, Snow-Flake, Fact and Dimensions Tables.

Query & Utility Tools: TOAD 11.X/10.X/8.x/7.x, SQL * Plus, SQL Loader, Teradata SQL Assistant 7.1, BTEQ, Fast Export, Sybase Advantage, Fast Load, Mload, Teradata TPT Scripts, Oracle SQL developer, SQL * loader.

Reporting Tools: Business Objects XI R3, OBIEE 11.1, SAP Business Objects 4.2, Cognos 8/8.4/10, and SSRS.

OS & Packages: ERP Packages

SDLC Methodologies: Agile, Scrum, Waterfall, and RAD.

PROFESSIONAL EXPERIENCE

Confidential

Sr. ETL Informatica Power Center/ Cloud Integration Developer

Responsibilities:

  • Migrated sales data from legacy SAP systems to the Co-Star cloud system using Informatica Power Center 10.1 and shell scripts.
  • Received Co-Star (third-party cloud system) retail-financial delimited data files using SFTP shell scripts.
  • Processed the data with Informatica Power Center 10.1 and loaded it into SAP systems using SAP/ALE Prepare-for-IDoc transformations.
  • Accessed real-time data from third-party cloud systems using shell scripts and integrated it into SAP using Informatica Power Center 10.1 for reporting solutions.
  • Worked with the Informatica Big Data tool, moving data from Oracle and SAP sources to Hive and Cassandra databases.
  • Worked in an Amazon Web Services cloud environment using the DataStax Enterprise package, which includes Spark and Cassandra (a NoSQL database).
  • Developed batch scripts to fetch data from AWS S3 storage, perform the required transformations in Scala using Spark, and save the results to a Cassandra keyspace.
  • Created different keyspaces in Cassandra: an intermediate keyspace, and a final keyspace storing the aggregated data used by the MicroStrategy team for reports and dashboard visualization.
  • Made decisions on the structure/DDLs of tables created in Cassandra, choosing appropriate partition and clustering keys based on business requirements and lookup patterns on the tables.
  • Used appropriate spark-submit parameters (number of executor cores, driver/executor memory, and consistency levels when reading from and writing to Cassandra) for better performance, as well as parameters to avoid tombstone issues in Cassandra.
  • Involved in creating mapping documents for source to target column mappings and business rules required for transformations and aggregations.
  • Monitored Spark jobs through the Spark master URL. Used a Git repository for code management.
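The spark-submit tuning described above can be sketched as follows. This is an illustrative Python helper, not project code: the master URL, jar name, host, and tuning values are hypothetical, and the consistency-level property shown is the Spark Cassandra Connector's write setting.

```python
# Illustrative sketch (assumptions, not the project's actual script):
# assemble spark-submit arguments for a job that reads from S3 and writes
# to a Cassandra keyspace, exposing the tuning knobs named in the bullets.

def build_spark_submit(app_jar, executor_cores=4, executor_memory="8g",
                       driver_memory="4g", cassandra_host="127.0.0.1"):
    """Return the spark-submit argument list: executor cores,
    driver/executor memory, and the Cassandra write consistency level."""
    return [
        "spark-submit",
        "--master", "spark://master:7077",          # Spark master URL (hypothetical)
        "--executor-cores", str(executor_cores),
        "--executor-memory", executor_memory,
        "--driver-memory", driver_memory,
        # Consistency level for Spark Cassandra Connector writes
        "--conf", "spark.cassandra.output.consistency.level=LOCAL_QUORUM",
        "--conf", f"spark.cassandra.connection.host={cassandra_host}",
        app_jar,
    ]

args = build_spark_submit("etl-aggregates.jar")
```

In practice these values would be sized per cluster; the list form makes the script easy to launch via `subprocess.run(args)`.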

Confidential, Tampa, FL

Sr. ETL Informatica Cloud Integration Developer

Responsibilities:

  • Migrated sales data from legacy system to Salesforce using Informatica Cloud integration.
  • Successfully integrated Sales data into Salesforce using transformation logic as per the business requirement.
  • Migrated integrated Salesforce data from the Salesforce cloud to Oracle and SQL Server for reporting purposes using Informatica Cloud mappings, Power Center 9.5, and MicroStrategy.
  • Integrated Salesforce data and CallidusCloud data for reporting on the CallidusCloud reporting platform.
  • Worked on integrating TrueComp commission and compensation plan calculations from CallidusCloud into the data mart along with Salesforce data.
  • Integrated Workday cloud data using Informatica cloud connector and integrated data into Oracle system and SQL Server for Data Mart reporting.
  • Implemented Java Code to fetch data from Eloqua marketing cloud to Oracle database and implemented ETL Informatica transformation logic to move data into EDW systems and finally to Sales Data Mart.
  • Implemented multiple fallback scenarios that look up the same source multiple times as per the requirement.
  • Executed multiple Proof of Concepts on fetching data from cloud systems to integrate data between Cloud systems and on-premise databases.
  • Implemented Data Mart reporting solution for data coming from Cloud systems.
  • Implemented Complex business logic with SCD type 2 retention in DataMart.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Worked in Agile Development model using VSTS and participated in scrum meetings in project implementation.
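The SCD Type 2 retention mentioned above follows a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new current row is inserted, preserving history. A minimal sketch (illustrative only; column and key names are hypothetical):

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key, tracked, today):
    """SCD Type 2 upsert: dim_rows is a list of dicts carrying a
    'current' flag plus effective start/end dates."""
    current = next((r for r in dim_rows
                    if r[key] == incoming[key] and r["current"]), None)
    if current is None or any(current[c] != incoming[c] for c in tracked):
        if current is not None:
            current["current"] = False        # expire the old version
            current["end_date"] = today
        dim_rows.append({**incoming, "current": True,
                         "start_date": today, "end_date": None})
    return dim_rows

rows = apply_scd2([], {"cust_id": 1, "region": "East"}, "cust_id",
                  ["region"], date(2020, 1, 1))
rows = apply_scd2(rows, {"cust_id": 1, "region": "West"}, "cust_id",
                  ["region"], date(2020, 6, 1))
# rows now holds the expired "East" version and the current "West" version.
```

In Informatica this same logic is typically built with Lookup, Expression, and Update Strategy transformations rather than hand-written code.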

Confidential

Sr. ETL Informatica Developer

Responsibilities:

  • Involved in different phases of Data Warehouse life cycle ranging from project planning, requirements gathering and analysis, ETL design, ETL development, Unit testing, Quality Assurance, Reporting, integration, regression and user acceptance testing, implementation, system documentation, support and offshore coordination.
  • Extensive Experience in Design and Development of ETL Process using Informatica Mappings using various Transformations such as Source Qualifier, Expressions, Filters, Joiners, aggregators, Lookups(connected and Unconnected), Update strategy, Sequence Generator, Routers and XML to load consistent data into Database.
  • Worked exclusively on the Informatica Test Data Management (TDM) tool for data protection against fraud and theft; created masked test data sets for lower environments (DEV, QA, Model regions) for a federal compliance regulations project.
  • Created policies, standard Rules, advanced rules, masked columns as per requirement by the business, generated TDM masking workflows and produced unit testing results.
  • Created schedules, run time environments and installed Informatica cloud secure agent on windows server.
  • Troubleshot firewall issues with the network team and configured proxy server settings to connect to Informatica Cloud.
  • Accessed Salesforce and Web Services through Informatica cloud connectors.
  • Created Mapping Configuration tasks, Data Synchronization tasks, Data Replication tasks, and mappings to access data from cloud systems to on-premise databases for reporting.
  • Implemented masking process for almost 200 databases in the financial environment, and worked on creating rules and updating rules as per the business requirements.
  • Implemented masking process on tables which are residing on Informatica Cloud environment and masked all the database tables which come under PII Project.
  • Masked lower-level databases (DEV, QA, Model regions) so that any compromise of data through unauthorized access would not result in fraud or theft of master data.
  • Exclusively worked on creating Technical Design documents by gathering business Requirements from Business Users and developed source to target design documents for development of workflows.
  • Developed mappings/Reusable Objects/Transformation/Mapplets by using mapping designer, transformation developer and Mapplet designer in Informatica Power Center 9.6.
  • Worked with business owners in gathering requirements and prepared logical and Physical data models using CA Erwin Data modeler.
  • Worked with the financial IT team in loading data from relational databases into the Netezza data warehouse.
  • Involved in writing SQL Stored procedures and Shell Scripts and Perl Scripts to access data from Oracle, DB2 and Flat files to load data into NETEZZA Data Warehouse.
  • Worked with the revenue, financial, sales, and billing teams and implemented financial calculations, producing reports with highly accurate numbers by performing multiple validations.
  • Extensive experience in performance tuning, identified and fixed bottlenecks and tuned the complex ETL Informatica mappings for better performance.
  • Worked on SAP BO reporting in preparing Adhoc and canned reports.
  • Implemented different tasks in workflows, including Session, Command, E-mail, Event-Wait, Event-Raise, Timer, etc.
  • Created and Executed workflows and Worklets using Workflow Manager to load the data into the Oracle Database.
  • Strong experience in implementing Informatica ETL for efficient data loading from different sources: Oracle, DB2, SQL Server 2005/2008, and flat files.
  • Performed extraction, transformation and loading of data from heterogeneous RDBMS tables, Flat Files, and written SQL to get data into Teradata in accordance with requirements and specifications.
  • Wrote UNIX Shell Scripts for Informatica Pre-Session, Post-Session and Autosys scripts for scheduling the jobs (work flows)
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Prepare test cases & perform Unit / Integration / System / Regression Testing. Assist Business during UAT.
  • Involved in Test Data Management team and worked on assigning policies and rules to the tables across different databases and validated the TDM workflows.
  • Generated masked data which masks different elements of secure data like Personally Identifiable Information-PII, Personal Card Information- PCI.
  • Automation of job processing using ESP scheduler, establishing automatic email notifications to the concerned persons by creating email tasks in workflow manager
  • Expertise in working with Oracle Stored Programs, Packages, Cursors, Triggers, Database Link, Snapshot, Tables, Constraints, Views, Indexes, Sequences.
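The masking rules described above typically combine format-preserving redaction for card data with deterministic substitution so masked values stay consistent across tables (keeping joins intact in DEV/QA). A hypothetical sketch of two such rules, not the actual TDM policies:

```python
import hashlib

def mask_card(pan):
    """PCI-style masking: keep only the last four digits of a card number."""
    return "*" * (len(pan) - 4) + pan[-4:]

def mask_name(name, salt="dev-mask"):
    """PII-style deterministic pseudonym: the same input always masks to
    the same token, so masked tables still join correctly."""
    digest = hashlib.sha256((salt + name).encode()).hexdigest()[:8]
    return f"CUST_{digest}"

masked_pan = mask_card("4111111111111111")
masked_name = mask_name("Jane Doe")
```

Determinism is the key design choice here: random masking would break referential integrity between the ~200 masked databases mentioned above.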

Confidential, Lake Forest, IL

Sr. ETL Developer (Lead) / Business Analytics Analyst

Responsibilities:

  • Extensive Experience in Design and Development of ETL Process using Informatica Mappings using various Transformations such as Expressions, Filters, Joiners, aggregators, Lookups(connected and Unconnected), Update strategy, Sequence Generator, Routers and XML to load consistent data into Database.
  • Exclusively worked on creating Technical Design documents by gathering business Requirements from Business Users and developed source to target design documents for development of workflows.
  • Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 9.1.
  • Worked with Teradata team in loading the data using relational connections and using different utilities to load the data like BTEQ, Fast Load, Multi Load, Fast Export, TPUMP, TPT scripts.
  • Involved in writing SQL Stored procedures and Shell Scripts and Perl Scripts to access data from Oracle, Teradata and Flat files.
  • Worked with the revenue, financial, sales, and billing teams and implemented financial calculations, producing reports with highly accurate numbers.
  • Extensive experience in performance tuning, identified and fixed bottlenecks and tuned the complex Informatica mappings for better performance.
  • Worked on SAP BO reporting in preparing Adhoc and canned reports.
  • Worked with global temporary tables and volatile tables, and used secondary indexes when joining two tables for fast data retrieval, improving performance and thereby efficiency.
  • Implemented different Tasks in workflows which include Session, Command, E-mail, Event-Wait, Event - Raise, Timer etc.
  • Created and Executed workflows and Worklets using Workflow Manager to load the data into the Oracle Database.
  • Strong experience in implementing Informatica ETL for efficient data loading from different sources: Oracle, Teradata, SQL Server 2005/2008, DB2, and flat files.
  • Involved in different phases of Data Warehouse life cycle ranging from project planning, requirements gathering and analysis, offshore coordination, ETL design, ETL development and testing, Reporting, integration, regression and user acceptance testing, implementation, system documentation and support.
  • Performed extraction, transformation and loading of data from RDBMS tables, Flat Files, SQL into Teradata in accordance with requirements and specifications.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Create UNIX shell scripts, ETL mappings, sessions & workflows using Informatica Power Center.
  • Worked in a team as part of a Big Data initiative: Grainger is projecting a portion of sales transactional data into Actian Pervasive, one of the fastest databases for analytics.
  • Automation of job processing using Autosys scheduler, establishing automatic email notifications to the concerned persons by creating email tasks in workflow manager
  • Expertise in working with Oracle Stored Programs, Packages, Cursors, Triggers, Database Link, Snapshot, Tables, Constraints, Views, Indexes, Sequences.
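The Teradata utility loads above (BTEQ, FastLoad, MultiLoad) are driven by generated control scripts. A sketch of generating a minimal BTEQ import script; the logon string, table, file, and column layout are hypothetical, and exact script structure varies by site:

```python
# Illustrative sketch: build a BTEQ script that truncates a staging table
# and imports a comma-delimited file into it (all names are made up).

def bteq_load_script(logon, table, data_file):
    """Return a minimal BTEQ script: logon, clear staging, import rows."""
    return "\n".join([
        f".LOGON {logon};",
        f"DELETE FROM {table};",                      # truncate staging
        f".IMPORT VARTEXT ',' FILE = {data_file};",
        ".REPEAT *",                                   # apply to every record
        "USING (c1 VARCHAR(20), c2 VARCHAR(20))",
        f"INSERT INTO {table} VALUES (:c1, :c2);",
        ".LOGOFF;",
        ".QUIT;",
    ])

script = bteq_load_script("tdprod/etl_user,secret", "stg.sales", "sales.csv")
```

For bulk volumes, FastLoad or a TPT script would replace BTEQ, but the generate-then-invoke pattern from a shell wrapper is the same.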

Environment: Oracle 11g, Teradata 14, Informatica Power Center 9.1, SAP Business Warehouse, PL SQL, Fast Load, MultiLoad, TPump, TPT Scripts, Autosys, Erwin, Jira.

Confidential

Sr. ETL Developer

Responsibilities:

  • Strong experience in implementing Informatica ETL for efficient data loading from different sources: Oracle, Teradata, SQL Server 2005/2008, DB2, and flat files.
  • Applied transformation logic of varying complexity in transforming and transferring data into the downstream Teradata EDW.
  • Implemented scripts using Teradata utilities (FastLoad, MultiLoad, TPump, FastExport, TPT scripts) to schedule loads to the Teradata EDW.
  • In depth experience in Business Intelligence Technologies and Database with Extensive Knowledge in ETL Process and Reporting Services using SQL server 2008/2005, SSIS and SSRS
  • Configured and maintained Report Manager and Report Server for SSRS
  • Strong at developing Custom Reports and different types of Tabular Reports, Matrix Reports, Adhoc reports and distributed reports in multiple formats using SQL Server Reporting Services(SSRS).
  • Expert in various source transformations, including flat files, XML, and relational systems.
  • Proficient in analyzing Data warehouse by building cubes using SQLServer Analysis Services(SSAS).
  • Strong experience in defining referenced relationships and identifying KPIs in SSAS.
  • Worked exclusively for the asset management team on the financial investments through which the enterprise participates in the markets.
  • Responsible for all Informatica MDM Hub development/configuration (match rules, trust framework, IDD, Hierarchy Manager (HM)).
  • Designed and implemented stored procedures and triggers for automating tasks in SQL Server 2005/2008, Oracle 10g.
  • Leading and managing enterprise technical Master Data management (MDM) solutions
  • Owning and managing issue / conflict resolution; demonstrated problem solving and decision making skills
  • Worked on SAP BO reporting in preparing Adhoc and canned reports.
  • Providing expertise and architectural guidance for solution delivery
  • Working effectively in a fast-paced environment as part of a high-performing research, delivery and sustainment team
  • Leading requirement analysis, system analysis, design, development, testing, implementation
  • Managing a Tier 1 technical environment including; driving continuous improvement through managed metrics and change management
  • Used QualityStage to check the data quality of the source system prior to the ETL process. Participated in cross-functional efforts to support other teams, such as ETL and database tuning to support SSRS reporting.
  • Worked with Global Temporary tables and Volatile tables and used secondary Indexes in joining two tables for fast retrieval of data, to improve performance and there by efficiency.
  • Experience in using Normalizer transformation for normalizing the XML source data.
  • Extensively used XML Parser transformation to generate target XML files.
  • Involved in creating various UNIX and Perl scripts that support ETL job scheduling and initial validations of various tasks.
  • Worked on PL/SQL scripts and converted PL/SQL stored procedures into Informatica ETL mappings.
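The Normalizer/XML Parser work above boils down to turning repeating XML groups into flat relational rows, carrying parent keys down to each child. A self-contained sketch (the XML layout is a made-up example, not the project's schema):

```python
import xml.etree.ElementTree as ET

doc = """<orders>
  <order id="1"><item sku="A" qty="2"/><item sku="B" qty="1"/></order>
  <order id="2"><item sku="C" qty="5"/></order>
</orders>"""

def flatten(xml_text):
    """Emit one flat row per <item>, carrying down the parent order id,
    the way a Normalizer turns repeating groups into rows."""
    rows = []
    for order in ET.fromstring(xml_text).findall("order"):
        for item in order.findall("item"):
            rows.append({"order_id": order.get("id"),
                         "sku": item.get("sku"),
                         "qty": int(item.get("qty"))})
    return rows

rows = flatten(doc)
```

The reverse direction (generating target XML files, as with the XML Parser/Generator) is the same mapping applied backwards: grouping flat rows by the parent key.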

Environment: Informatica Power Center 9.6, MDM 9.6, Teradata V13, Oracle 10g, TOAD, Oracle SQL Developer, Microsoft SQL Server 2005/2008, DB2, HP Quality Center, XML, Flat Files, Windows, AccuRev, SAP ECC 6.0, HP-UX, OBIEE 11.1, Autosys.

Confidential, Cedar Rapids, IA

Sr. ETL Informatica Developer

Responsibilities:

  • Co-ordination with Data Modeler, Business & Systems (ETL & Reporting) team for Business requirements.
  • Translate Functional Requirements Document into Technical Requirements.
  • Manage Production & Non-Production Informatica Domains, Grids, Nodes and Repositories.
  • Upgrade Informatica Power Center from 8.6 to 9.1 and 9.5 and apply hot fixes and patches to upgrade software.
  • Worked on getting data from multiple source systems (Oracle, AS/400, SQL Server, and flat files) and integrating data from these sources into the target data warehouse.
  • Configured the integration servers with different code pages.
  • Maintenance of ETL Repositories & Integration Services using Admin Console.
  • Create the users and user groups in Admin Console.
  • Create folders in Repository Manager and provide appropriate privileges to users.
  • Create Connection Strings in Workflow Manager and grant appropriate access to developers.
  • Perform Application Deployment / Migration biweekly and monthly. This includes Informatica deployment, Database Changes, UNIX and setting up Autosys JIL.
  • Create UNIX shell scripts, ETL mappings, sessions & workflows using Informatica Power Center.
  • Prepare test cases & perform Unit / Integration / System / Regression Testing. Assist Business during UAT.
  • Work with Test Leads & approve / reject defects in HP Quality Center 11.0
  • Facilitate communication and collaboration within and around the team (release & sprint planning, user stories, etc.). Work with Agile Scrum teams to help resolve issues and conflicts.
  • Extensively worked on Installing Informatica Data Quality(IDQ) 9.1 and upgrading to 9.5 on windows machines.
  • Supported true ad hoc query with predictable and consistent query performance for all queries, with no requirement for known query workloads or precomputation of aggregates or summaries, using Tableau.
  • Installed Address Reference data by using Data Quality Content Installer.
  • Worked exclusively on transformations such as Address Validator, Association, Case Converter, Comparison, Consolidation, Key Generator, Labeler, Match, Merge, Parser, SQL, Standardizer, Web Service Consumer, Weighted Average, XML, and HTTP.
  • Provide technical guidance and mentor junior resource & offshore team members.
  • Performance Tuning of ETL and share ETL best practices with the developers.
  • Work with Business Users & resolve issues logged / assigned in HPQC within the SLA.
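The Standardizer/Case Converter/Parser transformations listed above apply rule-based cleansing: normalize case and whitespace, then expand abbreviations against reference data. A hypothetical sketch (the suffix map stands in for real reference data; it is not the project's rule set):

```python
import re

# Hypothetical reference data: common street-suffix abbreviations.
SUFFIXES = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD"}

def standardize_address(raw):
    """Uppercase, collapse whitespace, strip trailing periods, and
    expand street-suffix abbreviations, as a Standardizer rule would."""
    tokens = re.sub(r"\s+", " ", raw.strip().upper()).split(" ")
    tokens = [SUFFIXES.get(t.rstrip("."), t.rstrip(".")) for t in tokens]
    return " ".join(tokens)

out = standardize_address("123  main st.")
```

In IDQ the same effect comes from chaining Case Converter and Labeler/Standardizer transformations over managed reference tables rather than a hard-coded dict.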

Environment: Informatica Power Center 9.1/8.6, Informatica IDQ, Hyperion Essbase, HPQC, Erwin 3.5, Oracle 9i, SQL*Plus, Teradata V2R5, UNIX, SQL Server 2005, TOAD, Business Objects, Autosys, Tableau.

Confidential

Sr. ETL Informatica Developer

Responsibilities:

  • Setup the ETL environment for new project.
  • Maintenance of ETL Repositories & Integration Services using Admin Console.
  • Migrated data from legacy servers running Oracle with ODI (Oracle Data Integrator) ETL and PL/SQL code to new servers running Oracle with Informatica Power Center 8.6.
  • Created New mappings by using old transformation logic in ODI, and PLSQL Code and Created additional transformations to improve logic and functionality as an increment to existing phase designs.
  • Installed Informatica Power Center on New servers and migrated data from Legacy Servers by troubleshooting all the performance issues.
  • Installed and Configured Power Exchange 8.6
  • Created Data Maps using Power Exchange Navigator as per request from the Developers.
  • Upgraded Informatica from 8.1 to 8.6.
  • Installed Informatica 8.6 in Linux server, configured the domain, added nodes to the existing domain and configured the Repositories.
  • Created the users and user groups.
  • Perform Data Map Migration.
  • Creating Backup and Recovery Plan
  • Create and Configure Informatica Data Quality (IDQ) Services - (Model Repository Service, Data Integration Service and Analyst Service) using Admin Console.
  • Configured Data Base Connection in IDQ using Admin Console.
  • Designed and developed mappings using Source Qualifier, Expression, connected-Lookup and unconnected-Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner & Rank transformations.
  • Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 9.1.
  • Experience working on KPIs (key Performance Indicators) in software development projects on CPP platform.
  • Worked on the various claim amounts that generate revenue for Confidential, covering a wide range of categories from different parts of the state.
  • Worked on claims from different insurance companies that generate Medicaid amounts for Confidential, and created ETL logic to compute the revenue amount in staging and target using extensive business logic.
  • The most important KPIs were analyzed, and their usage in process-efficiency evaluation was discussed with the business owners.
  • Involved in gathering and converting the necessary complex PL/SQL calculations into Informatica mappings.
  • Served in the on-call support rotation over nights and weekends and provided assistance over the phone to get issues resolved.
  • The outcome of the measurement is used to initiate further process adjustments and improvements. It also enables benchmarking between different development projects and, based on the collected data, an easier search for best practices that can be broadly implemented.
  • Working closely with users/developers and administrators to resolve the production problems by reviewing design changes.
  • Participating in product and solution training to acquire and maintain a detailed level of product knowledge of core components of IDQ offerings and assigned solution areas, how each solution addresses business challenges, competitive information to identify how our solution stands apart, and what challenges/limitations may be encountered.
  • Knowledge of FTP and HIPAA compliant ANSI formatted EDI transactions, and thorough knowledge of security in handling PHI secure Health data transactions.
  • Implemented Type 2 Slowly Changing Dimensions to maintain the historical data in Data mart as per the business requirements.
  • Proficient in using Informatica Data Explorer IDE 9.5, Informatica Data Quality IDQ 9.5 and Data Analyst.
  • Experience working with Java/J2EE components such as Servlets, Spring, EJB, JSP, Hibernate, JDBC, WebSphere, etc.
  • Involved in data migration to a new server and, after complete migration, safely decommissioning the existing production servers.
  • Worked with OBIEE as a reporting tool for some of the projects run for the Medicaid and Medicare teams.
  • Involved in working with MDM project across the global enterprise through increased accuracy, reliability, and timeliness of business-critical data.
  • Worked with Informatica Power Exchange 8.6.1, which works in conjunction with Power Center 9.1 to capture changes to data in source tables and replicate those changes to target tables and files.
  • Worked with Power Exchange for CDC techniques for relational database sources on UNIX, and Windows operating systems.
  • Created SAS Document for Configuration team to execute the workflows in the order to move the data from Dev box to QAT environments for testing purposes.
  • Exclusively worked with BI/QA teams in troubleshooting defects and escalating them to Architects and High level in case of defects in design and also worked in Data Quality Tickets for Performance Tuning in case of slowly running queries.
  • Worked with existing OWB and ODI data ETL and Data Integrator tools to better accommodate the data with the existing Databases into the new ones.
  • Regularly involved in RAD Methodology for one project, and also involved in Agile for other project, different SDLC methodologies for different projects.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
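The Power Exchange CDC work above rests on one core pattern: detect which source rows are new, changed, or gone, and replicate only those actions to the target. Power Exchange reads changes from database logs; a snapshot-diff sketch of the same idea (keys and rows are made-up examples):

```python
# Illustrative change-data-capture sketch (not PowerExchange itself):
# diff a source snapshot against the target and emit replication actions.

def capture_changes(source, target):
    """source/target: dicts of key -> row. Returns (action, key, row) list."""
    actions = []
    for key, row in source.items():
        if key not in target:
            actions.append(("INSERT", key, row))      # new in source
        elif target[key] != row:
            actions.append(("UPDATE", key, row))      # changed in source
    for key in target:
        if key not in source:
            actions.append(("DELETE", key, None))     # removed from source
    return actions

changes = capture_changes({1: "a", 2: "b"}, {2: "old", 3: "c"})
```

Log-based CDC avoids the full-table scan this sketch implies, which is why it is the cost-effective choice for large mainframe sources.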

Environment: Informatica Power Center 9.1/8.6/8.1, Teradata R13, Oracle 10g, Teradata SQL Assistant 7.1, TOAD, Oracle SQL Developer, Microsoft SQL Server 2005/2008, DB2, HP Quality Center, XML, Flat Files, Windows, Remedy, AccuRev, HP-UX, Autosys, Erwin, Business Objects XI Release 2, SAP BO.

Confidential, Seattle, WA

Sr. ETL Informatica developer- Teradata DWH

Responsibilities:

  • Strong experience in implementing Informatica ETL for efficient data loading from different sources: Oracle, Teradata, SQL Server 2005/2008, DB2, and flat files.
  • Applied transformation logic of varying complexity in transforming and transferring data into the downstream Teradata EDW.
  • Implemented scripts using Teradata utilities (FastLoad, MultiLoad, TPump, FastExport, TPT scripts) to schedule loads to the Teradata EDW.
  • Worked with global temporary tables and volatile tables, and used secondary indexes when joining two tables for fast data retrieval, improving performance and thereby efficiency.
  • Involved in Coding PL SQL Scripts for the Insert, Update, and Delete Scripts for the data movement from Source to Staging and from Staging to Targets.
  • Involved in Data Migration Project to submit all the data to the Government for the ATT deal with Confidential to Merge the Company and Acquire it.
  • Extensively worked with Teradata Development team in better integration of data coming from different data sources and improve performance in loading data using Utilities like BTEQ, Fast-Load, Multi-Load, Fast-Export, TPUMP and TPT scripts.
  • Extensively worked with staging area to perform data cleansing, Data profiling, data filtering and data standardization process.
  • Designed and developed mappings using Source Qualifier, Expression, connected-Lookup and unconnected-Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner & Rank transformations.
  • Created an SAS document for the configuration team to execute the workflows in order, moving data from the Dev box to QAT environments for testing purposes.
  • Experience working with Salesforce.com (SFDC) integration.
  • Worked closely with BI/QA teams to troubleshoot defects, escalating design defects to architects and senior staff, and worked data quality tickets to tune slow-running queries.
  • Extensively worked in Agile Methodology in all phases of Project.
  • Experience working with Java/J2EE components such as Servlets, Spring, EJB, JSP, Hibernate, JDBC, and WebSphere.
  • Worked on JavaScript in applications supporting the company's main site.
  • Knowledge of FTP and HIPAA compliant ANSI formatted EDI transactions.
  • Worked with business stakeholders to gather requirements and, based on business needs, suggested the best development strategies to adopt.
  • Defined UNIX batch scripts to automate execution of Informatica workflows.
  • Expertise in working with Oracle Stored Programs, Packages, Cursors, Triggers, Database Link, Snapshot, Tables, Constraints, Views, Indexes, Sequences.
  • Worked extensively on retiring legacy servers, migrating code to new servers, and changing operating systems.
  • Developed SQL overrides in Source Qualifier and Lookup transformations according to business requirements.
  • Understood, applied, and communicated best-practice methodologies internally and externally.
  • Worked proactively with systems engineers, technical architects, and other solutions consultants to strategize on opportunities, cross-training, and knowledge transfer in DataFlux technologies.
  • Analyzed technical system problems and designed and implemented effective solutions for existing DataFlux applications.
  • Used connected/unconnected Lookup transformations to look up values in flat files and relational sources and targets for full-load and incremental-load strategies.
  • Implemented Type 2 Slowly Changing Dimensions to maintain the historical data in sales Data mart as per the business requirements.
  • Worked on Speech Integrated Voice Recognition (SIVR) projects, creating call flows for the automated answering system that handled large volumes of customer calls, reduced the load on the care center, served data through SIVR, and passed call information to the EDW.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
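The Type 2 slowly-changing-dimension logic mentioned above (expire the current version, insert a new one when a tracked attribute changes) can be sketched in a few lines. This is an in-memory illustration, not the actual mapping; column names such as `cust_id` and `city` are hypothetical:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key, tracked, today):
    """Apply SCD Type 2 logic: keep history by versioning rows.

    dim_rows: existing dimension rows (dicts with eff_date, end_date,
    current). incoming: new source rows. Rows are expired in place.
    """
    out = list(dim_rows)
    for src in incoming:
        live = [r for r in out if r[key] == src[key] and r["current"]]
        if live and all(live[0][c] == src[c] for c in tracked):
            continue                      # no change: keep current version
        if live:                          # expire the old version
            live[0]["end_date"] = today
            live[0]["current"] = False
        new = dict(src)                   # insert the new version
        new.update(eff_date=today, end_date=None, current=True)
        out.append(new)
    return out

dim = [{"cust_id": 1, "city": "Seattle", "eff_date": date(2011, 1, 1),
        "end_date": None, "current": True}]
res = apply_scd2(dim, [{"cust_id": 1, "city": "Tacoma"}],
                 "cust_id", ["city"], date(2012, 6, 1))
```

In the actual mapping this decision is typically made with a Lookup on the dimension plus an Update Strategy transformation routing rows to update (expire) or insert (new version).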

Environment: Informatica Power Center 6.1/5.1, Oracle9i, SQL*PLUS, TERADATA V2R5, UNIX, SQL SERVER 2005, TOAD, Business Objects, Autosys.

Confidential, Baltimore, MD

Sr. Informatica Developer/Teradata

Responsibilities:

  • Developed complex Informatica mappings with various transformations
  • Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 8.6.1.
  • Understand business requirements and convert them into a technical design document.
  • Understand the Data Model and Structure of the source and destination Database. Created business rules and validations that are derived from the business requirements.
  • Designed various mappings (Source-to-Target) using DataStage to link between different source systems and destination systems.
  • Used DataStage Manager utilities in Designer for importing metadata from the repository.
  • Created controlling sequencer jobs using the DataStage Job Sequence, and created new sequential file definitions and imported/exported jobs.
  • Used DataStage Director to validate, run and monitor the DataStage jobs.
  • Used Lookup, Join, and Merge stages to join data from various sources, and also used Parallel Transformer, Column Generator, Funnel, Filter, Switch, Modify, Pivot, and Row Generator stages.
  • Responsible for performance tuning of the ETL process and upgrading the ETL best practices document.
  • Effectively worked with tuning SQL queries and Procedures in order to improve the performance of the jobs.
  • Analyzed, Designed and Implemented the ETL architecture and generated SSRS reports.
  • Worked with OWB and ODI ETL and data integration tools, leveraging pre-built modules for CDC, bulk loading, etc.
  • Implemented data integrity controls that create a data “firewall” and reduce data prep time by not processing erroneous data.
  • Helped determine and recommend the workflow for raw property file conversion, quality control, data standardization, and cleansing along with DataFlux.
  • Create and maintain various reporting tools that will be necessary to profile the data for the Assessment team and the Data Quality team.
  • Maintain ownership of a plan for parallel processing while awaiting the delivery of a replacement system.
  • Worked with Source System Analysts, developers and business owners to better identify data sources for defining data extraction methodologies.
  • Migrated data from internal legacy DEVICC servers and legacy test ICC servers to higher-configuration servers to improve performance and handle greater loads.
  • Experience writing PL/SQL for complex stored procedures handling complex database calculations.
  • Involved in Data Migration to take over the small company which Geico Acquired for $34M.
  • Experience working with MDM, improving the tracking, transparency, and auditing of financial data.
  • Designed the metadata tables and created mappings to populate the same. These tables were used to generate the parameter files.
  • Created mappings to load data from flat files and relational sources into staging tables and into the Enterprise Data Warehouse, transforming the data according to business rules and populating the data mart with only the required information.
  • Worked on JavaScript in applications supporting the company's main site.
  • Used transformations such as Expression, Filter, Joiner, Aggregator, Lookup (connected and unconnected), Update Strategy, Sequence Generator, Router, and XML to load consistent data into the database.
  • Involved in writing SQL Stored procedures and Shell Scripts and Perl Scripts to access data from Oracle, and MySQL.
  • Created reusable transformations to clean the data, which were used in several mappings.
  • Developed and tested stored procedures, functions and packages in PL/SQL.
  • Extracted data from various sources like MS SQL Server 2008, DB2, flat files, Excel spreadsheets, Oracle and XML files and loaded into the oracle database.
  • Designed and implemented stored procedures and triggers for automating tasks in SQL Server 2005/2008, Oracle 10g.
  • Developed flowcharts for complex stored procedures.
  • Development of an automated process for loading the data into the base tables in EDW (for ETL).
  • Created and Executed workflows and Worklets using Workflow Manager to load the data into the Oracle Database.
  • Developed custom interface for executing and managing SSIS packages.
  • Developed mappings using SSIS and loaded data into target systems for projects built on MS SSIS, SSRS, and SSAS technologies.
  • Maintained Versions of SQL Scripts, Documentation using StarTeam/Accurev.
  • Worked with Informatica Administrator to transfer project folders in development, test and production environments.
  • Involved in creating various UNIX and Perl scripts that support ETL job scheduling and initial validation of various tasks.
  • Wrote UNIX shell scripts for Informatica pre-session and post-session tasks, and Autosys scripts for scheduling the jobs (workflows).
  • Upgraded the systems from 8.1.1 to 8.6 with the help of the configuration team and extended support in resolving issues during the upgrade.
  • Generated ad hoc reports using Business Objects and SSRS.
  • Used Informatica Version Control for checking in all versions of the objects used in creating the mappings, workflows to keep track of the changes in the development, test and production environment.
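Shell automation for pre/post-session tasks and scheduler jobs like those above typically wraps the PowerCenter `pmcmd` CLI. A hedged sketch that only builds the command line; the service, domain, folder, and workflow names are hypothetical, and `pmcmd` itself is assumed to be on PATH:

```python
def pmcmd_start(workflow, folder, service, domain,
                user_var="PMUSER", pwd_var="PMPASS"):
    """Build a pmcmd startworkflow command (as a subprocess-style list).

    Uses environment-variable credentials (-uv/-pv) so no password is
    embedded in the script; all names here are illustrative.
    """
    return [
        "pmcmd", "startworkflow",
        "-sv", service, "-d", domain,
        "-uv", user_var, "-pv", pwd_var,
        "-f", folder, "-wait", workflow,
    ]

cmd = pmcmd_start("wf_load_edw", "EDW_DEV", "IntSvc_Dev", "Domain_Dev")
# A wrapper script would pass this to subprocess.run(cmd, check=True)
# so a non-zero pmcmd exit code fails the Autosys job.
```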

Environment: Informatica Power Center 6.1/5.1, ERWIN 3.5, Oracle9i, SQL*PLUS, TERADATA V2R5, UNIX, SQL SERVER 2005, TOAD, Business Objects, Autosys.

Confidential, Charlotte, NC

Informatica Developer

Responsibilities:

  • Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML. Developed various Ad-hoc mappings for various business needs.
  • Worked with Teradata team in loading the data using relational connections and using different utilities to load the data like BTEQ, Fast Load, Multi Load, Fast Export, TPUMP, TPT scripts.
  • Worked with global temporary tables and volatile tables, and created secondary indexes to improve performance when joining two tables.
  • Performed extraction, transformation and loading of data from RDBMS tables, Flat Files, SQL into Teradata in accordance with requirements and specifications.
  • Understanding business needs and implement the same into a functional database design.
  • Developed new Informatica mappings and workflows and updated existing ones: adding columns, deleting unused ones, and moving some columns to other tables as specifications were developed.
  • Created SQL joins, sub queries, tracing and performance tuning for better running of queries in SQL Server 2005.
  • Created and Configured Dimensions, Attributes and Hierarchies.
  • Developed and tested all the backend programs, Error Handling Strategies and update processes.
  • Created mappings using the transformations like Source Qualifier, Aggregator, Expression, Look Up, Router, Filter, Update Strategy, Joiner, Sequence Generators and Stored Procedure.
  • Wrote stored programs (procedures and functions) to perform data transformations, integrating them with Informatica programs and the existing application.
  • Experience in using Normalizer transformation for normalizing the XML source data.
  • Involved in Data Migration Project to change the existing Server to a new one.
  • Extensively used XML transformation to generate target XML files.
  • Developed Perl Script for loading of Data from Source to Target.
  • Created, scheduled, and monitored workflow sessions on a run-on-demand and run-at-time basis using the Informatica Power Center Workflow Manager.
  • Used Erwin data modeling tool to do conceptual/logical/physical data models.
  • Developed UNIX shell scripts and scheduled them using Autosys.
  • Designed Excel Sheets for each mapping of their Test Scenarios.
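Bulk loads through the Teradata utilities named above are driven by control scripts; generating them keeps table and file names in one place. A minimal sketch that renders a FastLoad-style control script as text. The table, column, and file names are hypothetical, and the logon line is a placeholder to be filled at deploy time:

```python
def fastload_script(table, cols, data_file, errdb="ETL_ERR"):
    """Render a minimal Teradata FastLoad control script.

    Assumes a pipe-delimited VARTEXT input file; every name in the
    output is illustrative, not from the actual project.
    """
    defs = ",\n   ".join(f"{c} (VARCHAR(100))" for c in cols)
    vals = ", ".join(f":{c}" for c in cols)
    return (
        "LOGON tdpid/user,password;\n"          # placeholder credentials
        f"BEGIN LOADING {table}\n"
        f"   ERRORFILES {errdb}.{table.split('.')[-1]}_e1,"
        f" {errdb}.{table.split('.')[-1]}_e2;\n"
        'SET RECORD VARTEXT "|";\n'
        f"DEFINE {defs}\n"
        f"   FILE={data_file};\n"
        f"INSERT INTO {table} VALUES ({vals});\n"
        "END LOADING;\n"
        "LOGOFF;\n"
    )

s = fastload_script("EDW.orders", ["order_id", "cust_id"],
                    "/data/orders.dat")
```

Writing `s` to a file and invoking the `fastload` utility against it would perform the bulk load; MultiLoad and TPT scripts follow the same generate-then-invoke pattern.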

Environment: Informatica Power Center 8.6.1, Teradata V2R5/V2R6, SQL Server 2005/2008, T-SQL, TOAD, Oracle SQL Developer, Quality Center, Windows 2008, UNIX, Perl scripting, Cognos, shell scripting, Autosys.
