Sr. Informatica Cloud Developer Resume

Plano, TX

PROFESSIONAL SUMMARY:

  • 9 years of experience in Information Technology as a Data Architect and Informatica ETL Architect (Cloud and On-prem).
  • Worked as an Informatica Admin, Developer, Lead, and Architect across various domains on Data Warehouse/Data Mart development using Informatica PowerCenter, PowerExchange, B2B, IDQ, TDM, Informatica Analyst, and Informatica Intelligent Cloud Services.
  • Excellent exposure to Tableau, Qlik Sense, MicroStrategy, Cognos, and DataStage.
  • Extensive experience processing Web Service sources (WSDL) with the Web Services Consumer transformation, as well as Mainframe sources.
  • Very good experience writing complex Teradata queries, including stored procedures, functions, and triggers that implement business rules and validations in various environments.
  • Working experience using Team Foundation Server.
  • Actively participated in Informatica deployment activities from lower environments to QA and Production environments.
  • Contributed to and actively provided comments in user story review meetings within an Agile Scrum environment.
  • Designed and developed PL/SQL stored procedures and functions, and tuned SQL queries for better performance.
  • Created and used reusable transformations and mapplets in Informatica PowerCenter.
  • Implemented Slowly Changing Dimensions (SCD Type 1 and 2) in various mappings; a minimal SQL sketch of the Type 2 logic follows this list.
  • Used the Incremental Aggregation technique to load aggregation tables with improved performance.
  • Involved in Informatica Integration Service performance improvement, addressing port exhaustion, clearing caches, and clearing huge log files on the Informatica server.
  • Improved the performance of Informatica mappings through partitioning, indexing the SQL queries in Source Qualifier and Lookup transformations, and providing sorted data to Aggregator and Joiner transformations.
  • Hands-on experience with the Informatica Repository Manager client and with creating deployment groups for code deployment between DEV, TEST, QA, and Production environments.
  • Hands-on experience with Windows batch scripting and UNIX commands.
  • Hands-on experience with the Control-M scheduling tool to schedule Informatica Integration Service jobs.
  • Experience in dimensional data modeling using Star and Snowflake schemas, fact and dimension tables, Slowly Changing Dimensions (Type I and II), physical and logical data modeling, surrogate key assignment, and CDC (change data capture).
  • Configured Informatica PowerCenter Integration Services to extract and load relational, non-relational, and changed data on a batch, change, and real-time basis.
  • Experience in integrating various data sources such as Oracle, Netezza, Sybase, SQL Server, MS Access, flat files (CSV and text files), XML files, web services, and Salesforce.
  • Expertise in working with various operational sources such as DB2, SQL Server, Oracle, Teradata, and flat files, loading them into a staging area.
  • Developed mappings and tasks using Informatica Cloud Real Time (ICRT) to synchronize data from Salesforce to an ODS database.
  • Hands-on experience in database development using Oracle PL/SQL to write stored procedures and functions; an illustrative procedure sketch also follows this list.
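
A minimal SQL sketch of the SCD Type 2 pattern referenced above, assuming hypothetical CUSTOMER_DIM and CUSTOMER_STG tables; in the actual mappings this logic was implemented with Lookup and Update Strategy transformations in PowerCenter:

    -- Step 1: expire the current version of any customer whose tracked attributes changed
    UPDATE customer_dim d
       SET d.eff_end_date = SYSDATE, d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.cust_name <> d.cust_name OR s.cust_addr <> d.cust_addr));

    -- Step 2: insert a new current version for changed and brand-new customers
    INSERT INTO customer_dim (customer_key, customer_id, cust_name, cust_addr,
                              eff_start_date, eff_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.cust_name, s.cust_addr,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1 FROM customer_dim d
                        WHERE d.customer_id = s.customer_id AND d.current_flag = 'Y');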
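
And a brief PL/SQL sketch of the kind of stored procedure mentioned above, using hypothetical CLAIMS_STG and CLAIMS_TGT tables and a simple validation rule:

    CREATE OR REPLACE PROCEDURE load_valid_claims AS
    BEGIN
      -- apply a business rule while moving data from staging to target
      INSERT INTO claims_tgt (claim_id, claim_amt, load_date)
      SELECT claim_id, claim_amt, SYSDATE
        FROM claims_stg
       WHERE claim_amt > 0;          -- reject non-positive amounts
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
    END load_valid_claims;
    /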

TECHNICAL SKILLS:

ETL/BI Tools : ICS, IICS, Informatica PowerCenter 10.1/9.6/9.1.0/8.6.1/8.0/7.1, MicroStrategy, Cognos, DataStage.

Databases : Teradata, MS SQL Server 6.5/7/2000/2005/2008, Oracle 12c/11i/10g/9i/8.1/6.0, MS Access, MySQL.

Query Tools : Toad 7.4/8.3.6, SQL Query Analyzer, SQL*Plus

Modeling Tools : Erwin 4.0, ER Studio

Languages : Siebel script/VB Script, Java, JavaScript, C, C++, HTML, Visual Basic, ASP, XML

Project Management tools : Microsoft Project Server.

GUI : Siebel Versions 6.0.3/7.5.3/7.8

Reporting Tools : Business Objects, OBIEE and Business Intelligence

Others : Visual SourceSafe, HEAT, MKS Toolkit

Operating Systems : Windows (NT, 95, 98, 2000, 2007), Windows XP, Windows 7, MS DOS

PROFESSIONAL EXPERIENCE:

Confidential

Sr. Informatica Cloud Developer

Responsibilities:

  • Served as ETL Architect to build and enhance the Enterprise Data Warehouse for analytics using IICS.
  • Designed and implemented ETL using Informatica Intelligent Cloud Services (IICS) to load data into the AWS Redshift cloud database from files received from different vendors.
  • Created mappings, tasks, and task flows in IICS for the data migration from files to the Redshift database; a minimal SQL sketch of the load pattern follows this list.
  • Migrated IICS code from one secure agent to another.
  • Resolved challenges faced while implementing the S3 connectors in IICS.
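
A minimal SQL sketch of the file-to-Redshift load pattern referenced above, with hypothetical bucket, IAM role, and table names; the actual loads were orchestrated through IICS mappings and task flows:

    -- Stage the vendor file from S3 into a Redshift staging table
    COPY stg_vendor_sales
    FROM 's3://example-bucket/incoming/vendor_sales.csv'          -- placeholder path
    IAM_ROLE 'arn:aws:iam::111111111111:role/RedshiftCopyRole'    -- placeholder role
    CSV IGNOREHEADER 1;

    -- Apply to the target with the usual Redshift delete-then-insert pattern
    BEGIN;
    DELETE FROM vendor_sales USING stg_vendor_sales s
     WHERE vendor_sales.sale_id = s.sale_id;
    INSERT INTO vendor_sales SELECT * FROM stg_vendor_sales;
    COMMIT;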

Environment: Informatica Intelligent Cloud Services, AWS S3, Redshift

Confidential, Plano, TX

Sr. Informatica Cloud Developer

Responsibilities:

  • Created high-quality mapping documents that are easily understood by the development team.
  • Designed and implemented data synchronization from Oracle 12c to the Snowflake database in Informatica Intelligent Cloud Services (IICS).
  • Designed and implemented ETL using IICS to load data into the Snowflake cloud database from a Vertica database.
  • Created mappings, tasks, and task flows in IICS for the database migration from Vertica to Snowflake; a minimal SQL sketch of the Snowflake load pattern follows this list.
  • Migrated IICS code from one secure agent to another.
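
A minimal SQL sketch of the Vertica-to-Snowflake load pattern referenced above, assuming the Vertica extracts are staged as CSV files in S3; stage, table, and credential names are placeholders, and the actual loads ran through IICS:

    -- Point an external stage at the extracted Vertica data
    CREATE OR REPLACE STAGE vertica_stage
      URL = 's3://example-bucket/vertica-extracts/'
      CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>');

    -- Bulk load the extracts into the Snowflake target table
    COPY INTO orders
      FROM @vertica_stage/orders/
      FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1);

    -- Quick sanity check on the loaded row count
    SELECT COUNT(*) FROM orders;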

Environment: IICS, AWS S3, Snowflake, Vertica

Confidential, TX

Sr. Informatica Consultant

Responsibilities:

  • Served as Sr. Data Architect providing architecture for projects on Netezza 3NF models for Salesforce, Amazon Redshift, Twitter feeds, and other systems.
  • Worked on a customer data cleansing project from Salesforce using Informatica IDQ to obtain a single version of truth for customer data.
  • Provided High-Level Design and Low-Level Design documents for enterprise projects across multiple platforms.
  • Built conceptual, logical, and physical models in ER Studio.
  • Maintained the business glossary and metadata for the data architecture and data models.
  • Conducted data architecture meetings and was responsible for business requirement gathering, business analysis, design and development, testing, and implementation of business rules.
  • Implemented best practices, performed code reviews, mentored team members on new standards, and provided technical input on performance issues, production incidents, etc.
  • Developed Informatica mappings to load files in different data formats into the database.
  • Designed solutions for various integrations at the enterprise level, as described below.
  • Migrated objects from IDQ to PowerCenter.
  • As Data Architect, designed data models and led development of a real-time Informatica IDQ process that takes XMLs, parses them, and loads them into 100+ tables.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Created business rules in Informatica Developer and imported them into Informatica PowerCenter to load standardized, well-formatted data into staging tables.
  • Installed and configured the Informatica Analyst tool.
  • Deployed code from Informatica Developer to Informatica Power Exchange because of the restrictions in Informatica Developer.
  • Implemented de-duplication (no-dupes) for the batch process.
  • As Data Architect, designed and developed data loading programs to support real-time data loads to Netezza data marts.
  • Integrated Informatica with Salesforce to load customer data into EDW using Informatica Cloud Services.
  • Performed detailed analysis of business and technical requirements and designed the solution by customizing various standard objects of Salesforce.com (SFDC).
  • Performed bulk loads from Salesforce (SFDC) through Informatica to the Teradata EDW.
  • Because Salesforce data is sensitive, used encryption patterns to keep the data secure during transfer.
  • Designed and led development of IDQ mappings using various transformations such as Labeler, Standardizer, Case Converter, Match, and Address Validator.
  • Used Address Validators to prepare master data for customers and vendors.
  • Presently working on implementing the new data mart and processing third-party vendor TLOGs (JSON files) to obtain sale amounts within the Confidential through Informatica IDQ.
  • Created the design for loading the parsed data into the DWH tables with complex transformations and data cleansing, and implemented pushdown optimization at different levels of the mapping.
  • Experienced in designing and developing Tableau visualizations; created business requirement documents and plans for creating dashboards.
  • Designed, created, and published customized interactive reports and dashboards, and scheduled them using Tableau Server.
  • Provided data architecture and ETL architecture to modify the existing data model and ETL in Informatica IDQ to load 120+ tables handled in one mapping, involving 300 million records.
  • Improved the ETL window for the 120+ tables from 20 hours to 4 hours.
  • Worked on implementing AWS and RedShift into the Informatica environment.
  • Designed complex mappings to load into RedShift tables.
  • Performed data archival instantly with a live stream into an S3 bucket through Unix scripts, because of the restrictions of the S3 adapter with Informatica.
  • Tuned Informatica workflows for better performance.
  • Provided data architecture and designed data models to consume Twitter data feeds for Wholefoods hashtags.
  • Worked on an Informatica mapping to get tweets mentioning Wholefoods, remove duplicates, and load them into Google BigQuery.
  • Worked with the Web Services Consumer transformation to get tax information from a 3rd party and send it across stores.
  • Created CDC (Change Data Capture) mappings using Power Exchange for sale transactions; a minimal SQL sketch of the CDC apply logic follows this list.
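
A minimal SQL sketch of the CDC apply logic referenced above, written as an Oracle-style MERGE for illustration; table and column names (SALES_TGT, SALES_CDC_STG, CHANGE_IND) are hypothetical, and in practice Power Exchange supplied the change indicators to the Informatica mappings:

    MERGE INTO sales_tgt t
    USING sales_cdc_stg s
       ON (t.sale_id = s.sale_id)
    WHEN MATCHED THEN
      UPDATE SET t.sale_amt = s.sale_amt, t.updt_ts = s.capture_ts
      DELETE WHERE s.change_ind = 'D'              -- captured deletes
    WHEN NOT MATCHED THEN
      INSERT (sale_id, sale_amt, updt_ts)
      VALUES (s.sale_id, s.sale_amt, s.capture_ts)
      WHERE s.change_ind = 'I';                    -- captured inserts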

Environment: Netezza, ER Studio, Informatica 9.6/8.6/7.1, ICS, Perl, DB2, Unix shell scripting, Tableau, Mainframes, PowerConnect, MicroStrategy, Informatica Developer, Redshift, Amazon AWS, Tidal, SVN.

Confidential - Chicago, IL

Sr. ETL Informatica Lead

Responsibilities:

  • Led the core ETL team in gathering requirements, performing source system analysis, developing ETL jobs to migrate data from the source to the target DW, and building the data architecture.
  • Analyzed the business requirement document and created a functional requirement mapping document for all the business requirements.
  • Worked on files coming from the Mainframe system, loading COBOL VSAM files into the DWH through Informatica PowerCenter.
  • Implemented a mapping in Informatica IDQ which takes XMLs as source, parses them, and loads them into 50+ tables. (This is the enterprise-level queue which receives all the CDC for the Confidential across the U.S. region.)
  • Performed data architecture in ER Studio; designed and created new schema objects (tables, views, users, and privileges) in Redshift, following the general format of the existing tables and allowing for new models and factors in the future. An illustrative Redshift DDL sketch follows this list.
  • Developed mappings to load data into Redshift tables through Informatica.
  • Monitored the performance of the designed process and implemented various performance techniques to improve efficiency.
  • Incorporated identified factors into Ab Initio mappings to build the data mart.
  • Migrated/upgraded Informatica from version 8.x to 9.x.
  • Created and configured Ab Initio scripts and wrapper scripts as Autosys jobs.
  • Experienced in designing and developing Tableau visualizations; created business requirement documents and plans for creating dashboards.
  • Designed a generic Informatica process to read data from various source systems and load it to SAP clients.
  • Very good understanding of installing cloud adapters in the Informatica environment.
  • Designed and provided a solution in Informatica to read from Java message queues in real time.
  • Designed and developed ETL mappings using transformation logic to extract data from various source systems.
  • Designed reference data and data quality rules using IDQ and was involved in cleansing the data in the Informatica Data Quality 9.1 environment.
  • Designed a process to load legacy data older than 5 years from the DWH into AWS.
  • Designed and led the team in developing the PII process and maintained customer data privacy using Informatica TDM.
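
An illustrative Redshift DDL sketch for the new schema objects referenced above; table, column, and group names are hypothetical, but the DISTKEY/SORTKEY pattern mirrors the existing tables:

    -- New fact table following the distribution and sort-key conventions of the schema
    CREATE TABLE edw.fact_store_sales (
        sale_id    BIGINT        NOT NULL,
        store_id   INTEGER       NOT NULL,
        sale_date  DATE          NOT NULL,
        sale_amt   DECIMAL(18,2),
        load_ts    TIMESTAMP
    )
    DISTKEY (store_id)
    SORTKEY (sale_date);

    -- Reporting view and privileges for the analytics group
    CREATE VIEW edw.v_store_sales AS
      SELECT store_id, sale_date, SUM(sale_amt) AS total_amt
        FROM edw.fact_store_sales
       GROUP BY store_id, sale_date;

    GRANT SELECT ON edw.v_store_sales TO GROUP reporting_users;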

Environment: Informatica PowerCenter 9.6, MicroStrategy, Tableau, Informatica PowerConnect, Flat files, XML Files, Oracle 10g/9i, DB2, MS SQL Server 2000, Netezza, Teradata, Shell Programming, SQL*Loader, ER Studio, Toad, Excel and Unix scripting, Sun Solaris, Windows XP, AutoSys, SVN.

Confidential - Watertown, Boston

Sr. ETL/Informatica Lead Developer

Responsibilities:

  • Analyzed requirements provided by various business users.
  • Created source and target definitions in the repository using Informatica Source Analyzer and Warehouse Designer.
  • Identified source systems, connectivity, tables, and fields to ensure data suitability for mapping.
  • Created different levels of stored displays to help different groups, users, and executives easily understand the data model per their requirements.
  • Collected data from source systems into the database.
  • Built model reports in HTML, text, and RTF formats using the Report Builder template.
  • Developed an Excel-based reusable component to generate SAP-ready flat files directly from the database, saving significant development effort.
  • Designed, developed, and documented Informatica mappings as per the requirements.
  • Extensively created mapplets, common functions, reusable transformations, and lookups.
  • Created and implemented PRE/POST scripts at both the schema and table level; a minimal pre/post-session SQL sketch follows this list.
  • Implemented error handling and exception reporting as per the user requirements.
  • Performed integration testing for various mappings; tested the data and data integrity among various sources and targets.
  • Created jobs and schedules for Informatica sessions and prepared standard documents for Tidal job creation.
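
A minimal sketch of the kind of pre/post-session SQL referenced above, with hypothetical table and index names; the pre-session step clears staging and disables a heavy target index, and the post-session step rebuilds it and refreshes statistics:

    -- PRE-session SQL
    TRUNCATE TABLE stg_orders;
    ALTER SESSION SET skip_unusable_indexes = TRUE;
    ALTER INDEX idx_orders_tgt_dt UNUSABLE;

    -- POST-session SQL
    ALTER INDEX idx_orders_tgt_dt REBUILD;
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'EDW', tabname => 'ORDERS_TGT');
    END;
    /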

Environment: Informatica PowerCenter 9.6, Informatica PowerConnect, Flat files, XML Files, Oracle 10g/9i, MS SQL Server 2000, Shell Programming, SQL*Loader, Toad, Excel and Unix scripting, Sun Solaris, Windows XP, AutoSys

Confidential - Regina, SK

Sr. Informatica ETL Developer

Responsibilities:

  • Worked on a total of 70 mappings and 10 stored procedures.
  • Developed, tested, and maintained ETL procedures employing both ETL tools and custom PL/SQL.
  • Participated in the development and maintenance of a Data Warehouse / Data Mart routine load schedule.
  • Performed production support duties on a 24/7 basis.
  • Involved in business requirements gathering to enable Data Integration across all business groups.
  • Extensively used PL/SQL to embed complex business logic in stored procedures.
  • Worked extensively with the Quality Assurance team to build an exhaustive set of test cases.
  • Implemented logic to control job dependencies between ETLs solely using event-raise and event-wait tasks and entries made by ETLs in pilot database tables.
  • Used most of the transformations, such as Connected and Unconnected Lookups, Filters, Routers, Joiners, Stored Procedure transformations, and Sequence Generators.
  • Worked on making session runs more flexible using mapping parameters and variables, and used parameter files and variable functions to manipulate them.
  • Configured the mappings to implement Slowly Changing Dimensions (Type 2).
  • Defined the Target Load Order Plan and constraint-based loading.
  • Wrote DOS scripts to merge files, append timestamp to filenames and move files.
  • Worked on data extraction from the SAP system and data loads to SAP using Informatica, and worked with the customer master MDM tool Siperian.
  • Experienced with the Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport, and SQL Assistant; a minimal sketch of the kind of SQL embedded in a BTEQ script follows this list.
  • Implemented the entire ETL job plan (50 jobs) through the Informatica scheduler.
  • Formulated procedures and policies required for data management.
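
A minimal sketch of the kind of Teradata SQL typically embedded in a BTEQ script after a FastLoad into staging; the POLICY_TGT and POLICY_STG names and columns are hypothetical:

    -- Upsert the staged rows into the target table
    MERGE INTO policy_tgt AS t
    USING policy_stg AS s
       ON t.policy_id = s.policy_id
    WHEN MATCHED THEN
      UPDATE SET premium_amt = s.premium_amt, updt_dt = CURRENT_DATE
    WHEN NOT MATCHED THEN
      INSERT (policy_id, premium_amt, updt_dt)
      VALUES (s.policy_id, s.premium_amt, CURRENT_DATE);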

Environment: Windows 2000, Informatica PowerCenter 9.0, Teradata 13.x, SQL Server 2008, Oracle 9i, SQL Server, SQL*Plus, TOAD, IBM Rational ClearCase versioning tool.

Confidential - Nicolet, QC

Sr. Informatica Admin

Responsibilities:

  • Studied and analyzed the business requirements and prepared the scope document and ETL specification documents accordingly.
  • Designed source-to-target mappings using Informatica PowerCenter for the Claims department. Created reports using Business Objects.
  • Conducted design review meetings with the architect for any new requirement to get the approval and designed the mappings according to the company standards.
  • Used MS Visio to document the Informatica design flows.
  • Assisted in Informatica Administration using Repository manager.
  • Developed complex mappings in Informatica to load data from various sources using different transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Sequence Generator, Joiner, Union, Filter, Update Strategy, Rank, and Router.
  • Worked with Informatica labels to label the designed components for deployment of code to TEST and PROD.
  • Well versed with Informatica team-based development (versioning).
  • Used Informatica versioning to check in and check out the objects in designer and workflow manager.
  • Troubleshot problems by checking session and error logs; also used the Debugger for complex troubleshooting.
  • Wrote complex SQL queries to test the data generated by the ETL process against the target database; a minimal validation-query sketch follows this list.
  • Used Golden and Toad to access Oracle tables and execute SQL queries.
  • Designed complex mappings involving target load order and constraint-based loading.
  • Performed performance tuning, bug fixing, and error handling.
  • Created parameter files for all the designed mappings and used them to define all the connection values (source, target, lookup) and the mapping parameters used in the Informatica sessions.
  • Used delta loading in Informatica to extract new claims, processed the data, and loaded it into the target Oracle tables.
  • Used SAP BO for reporting purposes.
  • Designed technical documents, the ETL unit test plan document, and the migration checklist document.
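
A minimal sketch of the kind of validation query referenced above, comparing a hypothetical CLAIMS_SRC extract against the loaded CLAIMS_TGT table:

    -- Row-count comparison between source extract and target
    SELECT (SELECT COUNT(*) FROM claims_src) AS src_cnt,
           (SELECT COUNT(*) FROM claims_tgt) AS tgt_cnt
      FROM dual;

    -- Rows present in the source but missing or different in the target
    SELECT claim_id, claim_amt, claim_status FROM claims_src
    MINUS
    SELECT claim_id, claim_amt, claim_status FROM claims_tgt;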

Environment: Informatica PowerCenter 8.1/8.6 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager, Workflow Monitor, Repository Manager), Business Objects XI, Oracle 11g, SQL Server 2008, Erwin 4.5, Oracle 10g/9i, MS Visio, Windows, ICD/CPT codes
