
Senior Informatica Developer Resume


Roseville, CA

SUMMARY

  • Eight-plus years of experience in the IT industry, focusing on business analysis, requirements, development, testing, deployment, and administration of data warehousing using industry-accepted methodologies and procedures. Expertise in Business Intelligence utilizing tools such as Informatica PowerCenter, Business Objects, and QlikView.
  • Designed and developed data warehouses and data marts based on Bill Inmon and Ralph Kimball methodologies.
  • Constantly interacted with business analysts to understand the reporting requirements
  • Analyzed source and target systems and prepared technical specification for the development of Informatica ETL mappings to load data into various tables.
  • Ensured Service Level Agreements (SLA) with business are met or exceeded and escalated internally or externally when necessary for resolution of day - to-day operational production support activities.
  • Designed and Developed Business Objects universes/reports to address business needs.
  • Administered the Informatica repository by creating and managing user profiles and metadata
  • Administered the Business Objects by creating and managing user profiles and metadata
  • Designed dimensional models/star schemas to populate the Datamarts across the Enterprise.
  • Established working relationship with various vendors that helped in successful delivery of the projects.
  • Led a team of junior developers/consultants to deliver a critical financial project for the company.
  • Established coding standards/naming standards in various technologies like Informatica, Business Objects, MS SQL Server.
  • Used Informatica Power Connect for SAP to pull data from SAP R/3.
  • Experience in creating ABAP mappings in PowerCenter with PowerConnect for SAP.
  • Designed and Developed routines using Data Quality project architect for MDM.
  • Experience in creating and maintaining entity objects, hierarchies, entity types, relationship objects, and relationship types using the Hierarchy tool to enable Hierarchy Manager (HM) in an MDM Hub implementation.
  • Designed and developed solutions to protect private information by encrypting and decrypting data moving between the company and the vendor and vice versa.
  • Designed and developed a strategy to populate the subject area hierarchies into data warehouse and datamarts according to the reporting needs.
  • Designed and developed the data transformations for source system data extraction, data staging, data movement and data aggregation.
  • Experience working on Informatica MDM to design, develop, test, review, and optimize MDM solutions.
  • Hands-on experience with data profiling tools such as DataFlux and Informatica.
  • Implemented error routines to handle incorrect data.
  • Extracted data from ERP sources using Informatica PowerConnect
  • Expertise in performance tuning at Target, source, mapping, session, system levels.
  • Developed Informatica mapping to capture data changes from the operational source systems into data warehouse and datamarts (slowly changing dimensions).
  • Developed Mapplets/transformations embedding business logic
  • Created source and target partitions to concurrently load the data into data warehouse and datamarts.
  • Created shared folders, local and global shortcuts to propagate changes across the repository
  • Used SQL tools like TOAD, Query Analyzer to run SQL queries and validate the data in warehouse and mart
  • Developed shell/Python scripts to handle incremental loads.
  • Experience in scheduling Informatica sessions and job dependencies using Autosys.
  • Experience in designing and developing Universes (the business view of the database), developing canned/ad-hoc reports, scheduling processes using Broadcast Agent, and administering Business Objects activities.
  • Created reports using Business Objects functionality such as Combined Queries, Slice and Dice, Drill Down, Functions, Cross Tab, Master Detail, and Formulas.
  • Experience in developing packages, procedures, triggers, and functions using PL/SQL and T-SQL.
  • Automated cloud deployments using Chef, Python, and AWS CloudFormation templates.
  • In-depth knowledge of AWS cloud services such as Compute, Network, Storage, and Identity & Access Management.
  • Hands-on experience in Microsoft Azure cloud services (PaaS & IaaS), Storage, Web Apps, Active Directory, Application Insights, Internet of Things (IoT), Azure Search, Key Vault, Visual Studio Online (VSO), and SQL Azure.
  • Used GCP for data warehousing: BigQuery, Google Cloud Storage, Compute Engine, Airflow DAGs, Cloud Composer, and Looker.
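The incremental-load scripting mentioned above typically follows a high-water-mark pattern: each run extracts only rows modified since the last successful run. A minimal Python sketch of the idea (record and column names are hypothetical, not from any actual project):

```python
from datetime import datetime

# Hypothetical high-water-mark incremental extract: keep only rows whose
# "modified" timestamp is newer than the last run's watermark, and advance
# the watermark for the next run.
def incremental_extract(rows, last_watermark):
    """Return (changed_rows, new_watermark)."""
    changed = [r for r in rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in changed), default=last_watermark)
    return changed, new_watermark

rows = [
    {"id": 1, "modified": datetime(2020, 1, 1)},
    {"id": 2, "modified": datetime(2020, 1, 5)},
]
changed, wm = incremental_extract(rows, datetime(2020, 1, 2))
# changed contains only id 2; wm advances to 2020-01-05
```

In practice the watermark would be persisted (e.g. in a control table) between runs; this sketch keeps it in memory for illustration.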

TECHNICAL SKILLS

Data Warehousing: Informatica PowerCenter 10.2.0/10.1.1/9.6.1/9.5/9.1.0/8.6.1/7.1/6.2 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica PowerConnect, DTS, SQL Server Integration Services (SSIS), Web Services, Informatica MDM, Informatica IDQ, IDD

Data Modeling: Erwin, Toad

Databases: Oracle 11i/10g/9i/8i/7.x, SQL Server 2003/2008, MS Access, Excel, salesforce.com

Cloud Tools: AWS, S3, EC2, Oracle RDS, GCP, BigQuery, GCS, Looker, Airflow/DAGs

Business Intelligence: Hyperion, OBIEE, Cognos (Impromptu, Transformer, PowerPlay Reports, Scheduler, IWR, PPWR, Upfront, Access Manager), Business Objects XI/6.5 (Supervisor, Designer, Reporter), SQL Server Analysis Services (SSAS), SQL Server Reporting Services (SSRS), Crystal Reports 9/10/11

Languages: SQL, T-SQL, ANSI SQL, PL/SQL, Unix Shell Script, Visual Basic, SQL*Plus 3.3/8.0

Tools: Toad, SQL Loader, Crystal Reports 9.0/8.x/7/6/5/4

Operating Systems: Windows 2007/2005/2003/XP/NT Server, Windows NT, UNIX, HP-UX, AIX 4.2/4.3, Sun Solaris 8

Analytical Tools: SQL Server Analysis Service SSAS, Performance Point Server 2007

PROFESSIONAL EXPERIENCE

Confidential, Roseville, CA

Senior Informatica developer

Responsibilities:

  • Worked as lead on the projects, involved in all phases of the SDLC.
  • Worked with the Data Architect in designing the data mart, defining, designing, and building facts and dimensions using the star schema model.
  • Developed Informatica workflows/worklets/sessions associated with the mappings across various sources like XML, COBOL, flat files, Webservices, Salesforce.
  • Worked closely with architects and engineers to recommend and design database and data storage solutions that effectively reflect business needs, security, and service-level requirements.
  • Created reusable components, reusable transformations and mapplets to be shared among the project team.
  • Developed mappings using different transformations like Source Qualifier, Expression, Lookup (Connected & Unconnected), Aggregator, Router, Rank, Filter and Sequence Generator.
  • Investigated and fixed bugs that occurred in the production environment and provided support: reviewed errors and log files, restudied/recoded the system, and debugged and troubleshot Informatica mappings.
  • Worked with ETL Developers in creating External Batches to execute mappings, Mapplets using Informatica workflow designer to integrate Shire’s data from varied sources like Oracle, DB2, flat files and SQL databases and loaded into landing tables of Informatica MDM Hub.
  • Configured match rule set property by enabling search by rules in MDM according to Business Rules.
  • Worked on data cleansing using the cleanse functions in Informatica MDM.
  • Worked with SCD Type 1, Type 2, and Type 3 to maintain history in dimension tables.
  • Worked with cleanse, parse, standardization, validation, scorecard transformations.
  • Worked with Source Qualifier, Update Strategy, XML, SQL, Web Services, Java, and Lookup (connected and unconnected) transformations.
  • Worked with pushdown optimization to improve performance.
  • Worked on parsing XML files and Java API connectors from DataStage, using custom-built stages.
  • Worked on UNIX shell scripting for file processing to third party vendor through SFTP, encryption and decryption process.
  • Experience in configuring, deploying, and supporting cloud services, including Confidential Web Services (AWS).
  • Participated in planning, implementation, and growth of the customer's Confidential Web Services (AWS) foundational footprint.
  • Worked with current application teams to understand existing applications and to make migration recommendations and to-be architectures in AWS.
  • Reviewed coding done to advance application upgrades, extensions, or other development; analyzed applications for data integrity issues.
  • Worked with the team to build out automation templates in Fugue or AWS CloudFormation in support of the managed-services platform.
  • Worked with application and architecture teams to conduct proofs of concept (POCs) and implement the design in the production environment in AWS.
  • Used WinSCP and FileZilla to verify incoming and outgoing files on the FTP server and for secure file transfer between local and remote computers.
  • Used various UNIX/Shell commands to edit different types of files and move files in and out of UNIX server and to remove control junk characters in UNIX files.
  • Worked with XML Parser and XML Generator transformations using Informatica PowerCenter to parse and generate XML files.
  • Worked with different scheduling tools like Tidal, Tivoli, Control M, Autosys
  • In the EDW, loaded consignment-related data based on business rules; this data was extracted from the SAP source system.
  • Worked on ABAP mappings in PowerCenter with PowerConnect for SAP.
  • Documented the ETL process for each project and provided knowledge transfer (KT) to the offshore team.
  • Worked closely with DBAs, application, database, and ETL developers, and change control management for migrating developed mappings across Dev, QA, and PROD environments.
  • Provided production support for the Informatica process; troubleshot and debugged errors.
  • Worked with Informatica IDQ Data Analyst and Developer tools, applying various data profiling techniques to cleanse data and match/remove duplicates.
  • Worked with Informatica PowerExchange and Informatica Cloud to integrate Salesforce and load data from Salesforce into an Oracle database.
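The XML parsing and generation work above (PowerCenter's XML Parser and XML Generator transformations) can be illustrated outside Informatica with Python's standard library; the document shape here is hypothetical:

```python
import xml.etree.ElementTree as ET

# Hypothetical order document, standing in for an incoming XML source file.
doc = "<orders><order id='1'><amount>10.5</amount></order></orders>"

# Parse: flatten the document into relational-style rows, as an
# XML Parser transformation would.
root = ET.fromstring(doc)
rows = [(o.get("id"), float(o.findtext("amount"))) for o in root.findall("order")]

# Generate: rebuild an XML document from rows, as an XML Generator would.
out = ET.Element("orders")
for oid, amount in rows:
    order = ET.SubElement(out, "order", id=oid)
    ET.SubElement(order, "amount").text = str(amount)
xml_text = ET.tostring(out, encoding="unicode")
```

The real mappings would also handle namespaces, nesting, and schema validation; this sketch shows only the parse/generate round trip.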

Environment: Informatica PowerCenter 10.2/10.1.1/9.6.1/9.5/9.1.0, MDM, IDD, Flat Files, Mainframe Files, T-SQL, Oracle 11i, Quest Toad Central 9.1, Unix Shell Scripting, Windows 2000/2003, SQL Server 2005/2008/2012/2016, Teradata, Netezza, Cognos, GCP, GCS, AWS, BigQuery.

Confidential

Informatica developer

Responsibilities:

  • Involved in design, development and implementation of the ETL projects end to end.
  • Responsible for ETL technical design discussions and prepared ETL high level technical design document.
  • Involved in the analysis of source to target mapping provided by data analysts and prepared function and technical design documents.
  • Involved in designing of star schema based data model with dimensions and facts.
  • Interacted with onsite and offshore teams to assign development tasks and scheduled weekly status calls with the offshore team.
  • Extracted data from flat files, GoldenGate, Oracle, and SQL Server using Informatica ETL mappings and loaded it into the data mart.
  • Utilized Informatica IDQ Data Analyst and Developer tools for data profiling, matching/removing duplicate data, fixing bad data, and fixing NULL values.
  • Created quality rules and development and implementation patterns with cleanse, parse, standardization, validation, and scorecard transformations.
  • Created complex Informatica mappings using Lookup (connected and unconnected), Joiner, Rank, Source Qualifier, Sorter, Aggregator, and Router transformations to extract, transform, and load data into the data mart area.
  • Used Teradata utilities FastLoad, MultiLoad, and TPump to load data from various source systems.
  • Developed scripts in BTEQ to import and export the data.
  • Developed CTL scripts to load the data to and from Teradata database.
  • Worked extensively on shell scripting for file management.
  • Created reusable transformations/mapplets used across various mappings.
  • Wrote complex PL/SQL scripts, functions, procedures, and packages.
  • Developed Informatica workflows/worklets/sessions associated with the mappings using Workflow Manager
  • Participated in the development and implementation of the MDM decommissioning project using Informatica PowerCenter, which reduced the cost and time of implementation and development.
  • Worked extensively with Netezza scripts to load data from flat files into the Netezza database.
  • Used NZSQL scripts and NZLOAD commands to load the data.
  • Created Tivoli Maestro jobs to schedule Informatica workflows.
  • Expert in performance tuning of Informatica code using standard Informatica tuning steps.
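The IDQ-style matching and duplicate removal described above usually reduces to standardizing fields into a match key and keeping one survivor per key. A minimal Python sketch under that assumption (field names and records are hypothetical):

```python
# Hypothetical customer records with inconsistent formatting, standing in
# for profiled source data.
records = [
    {"name": "  John Smith ", "email": "JOHN@EXAMPLE.COM"},
    {"name": "John Smith", "email": "john@example.com"},
    {"name": "Jane Doe", "email": "jane@example.com"},
]

def match_key(rec):
    """Standardize fields into a match key, as a cleanse/standardize step would."""
    return (rec["name"].strip().lower(), rec["email"].strip().lower())

# Keep the first record per match key; later records with the same key are
# treated as duplicates and dropped.
survivors = {}
for rec in records:
    survivors.setdefault(match_key(rec), rec)
unique = list(survivors.values())
```

Real IDQ match rules use fuzzy scoring rather than exact keys; exact-key deduplication is shown here only to make the cleanse-then-match flow concrete.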

Environment: Informatica PowerCenter 9.5/9.1.0/8.6.1, Flat Files, Mainframe Files, Oracle 11i, Quest Toad Central 9.1, Unix Shell Scripting, Windows 2000/2003, SQL Server 2005, T-SQL, SQL Server 2008, Teradata, Netezza, Aginity Workbench.

Confidential

Informatica developer

Responsibilities:

  • Created Informatica mappings using various transformations, including XML, Source Qualifier, Expression, Lookup, Stored Procedure, Aggregator, Update Strategy, Joiner, Normalizer, Union, Filter, and Router, in Informatica Designer.
  • Worked with FastLoad, MultiLoad, and TPump utilities to load data into Teradata.
  • Extensively worked with the Teradata database using BTEQ scripts.
  • Involved in all phases of the SDLC (design, code, test, and deploy) for ETL components of the data warehouse and integrated data mart.
  • Used NZSQL scripts and NZLOAD commands to load the data.
  • Worked extensively with Netezza scripts to load data from flat files into the Netezza database.
  • Created Web Services mappings for consumer and provider; used the Web Services Consumer transformation and XML Parser to parse incoming data.
  • Created and edited custom objects and custom fields in Salesforce and checked the field level Securities.
  • Worked on Salesforce (SFDC) session log error files to investigate errors and debug issues.
  • Worked on Informatica Cloud to create source/target SFDC connections and to monitor and synchronize data in SFDC.
  • Worked with Informatica PowerExchange as well as Informatica Cloud to load data into salesforce.com.
  • Involved in analyzing, defining, and documenting data requirements by interacting with the client and Salesforce team for the Salesforce objects.
  • Worked with cleanse, parse, standardization, validation, and scorecard transformations.
  • Worked with Informatica IDQ Data Analyst and Developer tools, applying various data profiling techniques to cleanse data and match/remove duplicates.
  • Worked as Informatica lead for ETL projects to design and develop Informatica mappings.
  • Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica.
  • Used UNIX scripts for file management as well as in FTP process.
  • Worked closely with DBAs, application, database, and ETL developers, and change control management for migrating developed mappings to PROD.
  • Provided production support for the Informatica process; troubleshot and debugged errors.
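Reviewing session error logs, as in the SFDC bullets above, often comes down to filtering error lines out of a large log. A hypothetical sketch (the log layout here is illustrative, not the actual SFDC or PowerCenter format):

```python
# Hypothetical session log excerpt; real session logs have a richer layout.
log = """INFO  row 1 loaded
ERROR row 2: REQUIRED_FIELD_MISSING
INFO  row 3 loaded
ERROR row 4: INVALID_EMAIL_ADDRESS
"""

def extract_errors(text):
    """Return the message portion of each ERROR line."""
    return [line.split(":", 1)[1].strip()
            for line in text.splitlines()
            if line.startswith("ERROR")]

errors = extract_errors(log)
# errors → ["REQUIRED_FIELD_MISSING", "INVALID_EMAIL_ADDRESS"]
```

Grouping the extracted messages by error code is a natural next step when triaging which rejected rows to reprocess.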

Environment: Informatica PowerCenter 9.5/9.1.0/8.6.1, Informatica Data Quality 9.1.0/9.5.1, Flat Files, Mainframe Files, Oracle 11i, Netezza, Quest Toad Central 9.1, Unix Shell Scripting, Windows 2000/2003, SQL Server 2005, SQL Server 2008

Confidential

Informatica developer

Responsibilities:

  • Worked closely with the Business analyst to understand the various source data.
  • Involved in designing Logical and Physical models for staging, transition of the Data.
  • Involved in designing of star schema based data model with dimensions and facts
  • Designed the ETL mapping document to map source data elements to the target based on the star-schema dimensional model.
  • Designed and developed Informatica Mapping for data load and data cleansing
  • Created Stored Procedure, Functions and Triggers as per the business requirements
  • Used Update Strategy and Lookup transformations to implement the Change Data Capture (CDC) process.
  • Partitioned sources to improve session performance.
  • Developed several complex Mappings, Mapplets and Reusable Transformations to facilitate One time and Monthly loading of Data
  • Utilized the Aggregate, Join, Router, Lookup and Update transformations to model various standardized business processes
  • Worked with Scheduler to schedule Informatica sessions on daily basis and to send an email after the completion of loading
  • Created design documents and performed Unit Testing on the Mappings.
  • Created complex SCD Type 1 and Type 2 mappings using Dynamic Lookup, Joiner, Router, Union, Expression, and Update Strategy transformations.
  • Worked on identifying Mapping Bottlenecks in Source, Target and Mappings and Improve performance.
  • Extensively used Workflow Manager to create connections, sessions, tasks and workflows
  • Performance tuned stored procedures, transformations, mappings, sessions and SQL queries
  • Worked on the Database Triggers, Stored Procedures, Functions and Database Constraints.
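The SCD Type 2 pattern used in the mappings above (expire the current dimension row, insert a new version) can be sketched in plain Python; the dimension layout and column names are hypothetical:

```python
from datetime import date

# Hypothetical customer dimension: one current row per business key.
dim = [{"key": "C1", "city": "Roseville", "current": True,
        "eff_date": date(2019, 1, 1), "end_date": None}]

def apply_scd2(dim, business_key, new_city, load_date):
    """Close the current row and insert a new version when an attribute changes."""
    for row in dim:
        if row["key"] == business_key and row["current"]:
            if row["city"] == new_city:
                return dim          # no change: nothing to do
            row["current"] = False  # expire the old version
            row["end_date"] = load_date
    dim.append({"key": business_key, "city": new_city, "current": True,
                "eff_date": load_date, "end_date": None})
    return dim

apply_scd2(dim, "C1", "Sacramento", date(2020, 6, 1))
# dim now holds two versions: the expired Roseville row and a current Sacramento row
```

In the PowerCenter mapping, the Dynamic Lookup plays the role of the current-row check here, and the Update Strategy routes rows to the expire/insert paths.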

Environment: Informatica 7.1/6.x, Oracle 10g, SQL Server 2005, Autosys, Business Objects 6.5
