
ETL Informatica Developer Resume

Kansas City, MO

SUMMARY

  • Over 7 years of IT experience in analysis, design, and development of business applications as an ETL developer, with major contributions in Informatica B2B Data Transformation (DT) Studio, Informatica PowerCenter 9.x/8.x, and Informatica B2B Data Exchange (DX).
  • Experience designing, developing, testing, reviewing, and optimizing Informatica MDM (Siperian) implementations.
  • Solid understanding of Hadoop architecture and components such as HDFS, Hive, Pig, MapReduce, Flume, Kafka, and Oozie. Strong knowledge of Hive, Pig, Spark (Scala/Python), DataFrames, and data streaming. Extensive experience in data warehousing, data architecture, and ETL loads from various sources into data warehouses and data marts using Informatica PowerCenter.
  • Database experience with Teradata 14.10/14, PostgreSQL, H2, Oracle 11g/10g/9i/8.x/7.x, MS SQL Server 2005/2000, AS/400, DB2, and MS Access; also used SQL editors such as Teradata SQL Assistant, H2 Console, TOAD, SQL*Plus, and SQL Analyzer.
  • Knowledge of Active Directory design, implementation, and administration.
  • Proficient in Informatica administration tasks such as installation, upgrades, migration, user management, backup/restore, code promotion, OS profiling, high availability (HA), grid, and LDAP.
  • Experience migrating SSRS reports from SQL Server 2000 to SQL Server 2005, 2008, and 2008 R2.
  • Involved in all SDLC aspects of ETL, including requirements gathering, data cleansing, data load strategies, mapping design and development, standard interfaces for various operational sources, and unit/integration/regression testing.
  • Proficient in data warehouse ETL activities using SQL, PL/SQL, Pro*C, SQL*Loader, C (including data structures in C), UNIX shell scripting, Python, and Perl.
  • Extensively worked on Informatica B2B Data Exchange setup, from endpoint creation and scheduling to partner setup, profile setup, and event attribute and event status creation.
  • Extensive experience in delivering OLAP solutions by developing Corporate Dashboard Reports using SQL Server Reporting Services (SSRS), Report Model and Ad Hoc Reporting using Report Builder functionality.
  • Defined relationship types using the Hierarchies tool to enable Hierarchy Manager (HM) in MDM Hub implementations.
  • Designing and implementing data warehouses and data marts using components of Kimball Methodology, like Data Warehouse Bus, Conformed Facts & Dimensions, Slowly Changing Dimensions, Surrogate Keys, Star Schema, Snowflake Schema, etc.
  • Experience in integration of various data sources like Teradata, Oracle, PostgreSQL, S3, HDFS, MS SQL Server, Flat Files, and XML Definitions.
  • Designed, Installed, Configured core Informatica/Siperian MDM Hub components such as Informatica MDM Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, Data Modeling.
  • Experienced in integrating various data sources such as Oracle 11g/10g/9i, IBM DB2, MS SQL Server, MySQL, Snowflake, Teradata, Netezza, XML files, and mainframe sources into staging areas and different target databases.
  • Used Informatica BDM 10.1.1 (Big Data Management) with IDQ to ingest data from AWS S3 raw to S3 refined, and from refined to Redshift.
  • Extensive experience in various DB/DW/BI technologies including Tableau, OBIEE, MicroStrategy, SSRS, Unica, Google Analytics, Alteryx, Informatica, SSIS, Teradata.
  • Extensive success in translating business requirements and user expectations into detailed specifications using the Unified Modeling Language (UML).
  • Experience writing expressions in SSRS and expert in fine-tuning reports; created many drill-through and drill-down reports using SSRS.
  • Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirements.
  • Strong development skills with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob Storage, and Azure Storage Explorer.
  • Working experience in all stages of design, development, and implementation of data mappings, mapplets, and sessions using Informatica PowerCenter 10.x/9.x, IDQ, Oracle, SQL, and UNIX.
  • Extensive experience in designing, developing and publishing visually rich and intuitively interactive Tableau workbooks and dashboards for executive decision making.
  • Benefits and enrollment specialist (EDI 834 file processing).
  • Experience in analysis, design, development, and implementation of Business Intelligence applications using MicroStrategy.
  • Installed and configured all MicroStrategy components, including MicroStrategy Desktop, Administrator, Intelligence Server, Web, SDK, and Narrowcast Server, on NT servers and mapped them to client machines.
  • Part of a team researching data integration on Amazon Redshift.
  • Expert in calculating measures and dimension members using MDX, mathematical formulas, and user defined functions.
  • Developed shell/Python scripts to handle incremental loads (a sketch of the watermark pattern appears at the end of this summary).
  • Strong business analysis skills and an understanding of the software development life cycle (SDLC).
  • IT experience in different tools and technologies including Informatica, Teradata, UNIX, Oracle, and PL/SQL.
  • Excellent experience in designing, modeling, performance tuning, and analysis, implementing data extraction, transformation, and loading processes using the ETL tool Informatica PowerCenter.
  • Designing end to end ETL processes to support reporting requirements. Designing aggregates, summary tables and materialized views for reporting.
  • Expertise in using heterogeneous source systems like Flat files (Fixed width & Delimited), XML Files, CSV files, IBM DB2, Excel, Oracle, Sybase and SQL.
  • Extensively built dashboards using techniques for guided analytics, interactive dashboard design, and visual best practices to convey the story inside the data using Tableau.
  • Experience in building workflow solutions, data integration, and Extract, Transform, and Load (ETL) solutions for data warehousing using DTS/SSIS packages from multiple sources such as Teradata, DB2, Oracle, MS Access, flat files, and SQL Server, and loading them into target databases.
  • Expertise in Extraction, Transformation, loading data from DB2, AS400, SQL Server, Access, Excel, Flat Files and XML using DTS, SSIS.
  • Expertise in creating and managing Event Handlers, Package Configurations, Logging, System and User-defined Variables for SSIS Packages.
  • Expertise in Debugging, Error logging, Error Handling and Production support for SSIS.
  • Proficient in designing and developing complex mappings using varied transformation logic, such as unconnected and connected Lookups, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
  • Hands on experience in TriZetto Facets 4.71 and 5.10 Membership and Plan/Product business modules.
  • Hands on experience on Facets Database schema, Batch process, Enrollment flow using MEET.
  • Worked in Repository Manager, Workflow Manager, Workflow Monitor, and Designer to develop mappings, mapplets, reusable transformations, tasks, workflows, and worklets to extract, transform, and load data.
  • Strong experience in client requirement analysis, physical, logical design development, resource planning, coding, debugging, testing, deployment, support and maintenance of business intelligence applications.
  • Expertise in broad range of technologies, including business process tools such as Microsoft Project, Pro model, MS Excel, MS Access, MS Visio, technical assessment tools and Data Warehousing concepts.
  • Quick learner with strong communication skills.
  • Expertise in logical modeling, physical modeling, dimensional modeling, and Star and Snowflake schemas.
  • Used Business Objects to provide performance management, planning, reporting, query and analysis, and enterprise information management.
  • Installed MicroStrategy Web Universal and MicroStrategy Intelligence Server in an AIX environment.
  • Follow good coding practices, such as naming conventions, code descriptions, documentation, and version control; participated in development and operations activities such as code reviews, pair testing, offshore coordination, and on-call support.
  • Experience in Batch and PowerShell scripting on the Windows platform.
  • Familiar with Agile project management; experienced in corporate IT environments and professional communication.
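
A minimal sketch, in Python, of the watermark-driven incremental load pattern referenced in the summary above. The control, source, and staging table names are hypothetical, and the standard library's sqlite3 stands in for the actual warehouse connection:

import sqlite3

def incremental_load(conn: sqlite3.Connection) -> int:
    cur = conn.cursor()
    # Read the high-water mark recorded by the previous run.
    cur.execute("SELECT last_loaded_id FROM etl_control WHERE job = 'orders'")
    watermark = cur.fetchone()[0]
    # Pull only rows newer than the watermark from the source table.
    cur.execute("SELECT id, amount FROM src_orders WHERE id > ?", (watermark,))
    rows = cur.fetchall()
    if rows:
        cur.executemany("INSERT INTO stg_orders (id, amount) VALUES (?, ?)", rows)
        # Advance the watermark so the next run skips these rows.
        cur.execute(
            "UPDATE etl_control SET last_loaded_id = ? WHERE job = 'orders'",
            (max(r[0] for r in rows),),
        )
    conn.commit()
    return len(rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE etl_control (job TEXT PRIMARY KEY, last_loaded_id INTEGER);
        CREATE TABLE src_orders (id INTEGER PRIMARY KEY, amount REAL);
        CREATE TABLE stg_orders (id INTEGER PRIMARY KEY, amount REAL);
        INSERT INTO etl_control VALUES ('orders', 0);
        INSERT INTO src_orders VALUES (1, 10.0), (2, 20.0);
    """)
    print(incremental_load(conn))  # 2 on the first run, 0 on an immediate rerun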

TECHNICAL SKILLS

ETL Tools: Informatica 10.1.0/9.6.1/9.6/9.1, Informatica Cloud, Informatica B2B, Informatica PowerExchange, Informatica Big Data Edition 9.6, Informatica IDQ, MS SSIS 2012/2008

RDBMS: Oracle 12c/11g/10g/9i, Teradata V16.20/15.10/14/13/12, DB2, SQL Server 2014/2012/2008, MySQL, Sybase

Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Erwin, E-R Modeling, Microsoft Visio.

Operating Systems: Windows, UNIX, Linux.

Reporting Tools: Dashboard Reporting, Tableau

Teradata Utilities: BTEQ, FastLoad, MultiLoad, TPT, TPump, SQL Assistant, Teradata Manager, Teradata Viewpoint

Languages: SQL, PL/SQL, XML, UNIX Shell Scripting, Perl

PROFESSIONAL EXPERIENCE

Confidential, Kansas City, MO

ETL Informatica Developer

Responsibilities:

  • Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
  • Communicated with business customers to discuss the issues and requirements.
  • Designed, documented, and configured the Informatica MDM Hub to support loading and cleansing of data.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Performed Informatica admin activities such as creating folders, connections, and users, and assigning proper privileges.
  • Designed and created mappings to extract data from numerous sources (Oracle and SQL Server databases, PostgreSQL, flat file loads) into staging in the data warehouse for reporting.
  • Developed tabular and ad-hoc reports using SSRS Report Designer.
  • Developed and deployed ETL job workflow with reliable error/exception handling and rollback within the MuleSoft framework.
  • Created B2B DX partners, profiles, workflows, and endpoints for onboarding clients.
  • Worked on split and merge processes for multi-client files using Informatica B2B Data Transformation and Informatica B2B Data Exchange.
  • Coordinated with the support team on administrative issues relating to the PowerCenter 9.1 environment.
  • Developed Spark scripts using Scala as per the requirement using Spark 1.5 framework.
  • Used Spark APIs on Cloudera Hadoop YARN to perform analytics on Hive data stored in HDFS.
  • Developed Scala scripts and UDFs using both DataFrames/SQL and RDDs in Spark for data aggregation, queries, and writing data back onto HDFS (see the PySpark sketch after this list).
  • Wrote Batch and PowerShell scripts to run jobs through Windows Task Scheduler.
  • Worked on proofs of concept to evaluate options for implementing Informatica B2B DT Studio.
  • Developed complex SSRS reports using multiple data providers, Global Variables, Expressions, user defined objects, aggregate aware objects, charts, and synchronized queries.
  • Prepared an administration and operations handbook for the new ETL environment on PowerCenter 9.1.
  • Deployed reports, created report schedules and subscriptions. Managing and securing reports using SSRS.
  • Built ad-hoc reports using SQL Server Reporting Services (SSRS).
  • Explored Spark to improve the performance and optimization of existing Hadoop algorithms using Spark context, Spark DataFrames, pair RDDs, double RDDs, and YARN.
  • Created Hive tables and worked extensively with HiveQL for analysis, transformation, and verification of data.
  • Created Windows PowerShell and Batch scripts to delete Windows server logs, restart Windows services, and execute T-SQL stored procedures in MS SQL Server.
  • Extensively worked on PeopleSoft AP and Project Costing modules.
  • Managed Salesforce to Active Directory integration using Workato and the MuleSoft Anypoint Platform.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Developed several complex IDQ mappings using a variety of PowerCenter transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer.
  • Complied with the system development life cycle (SDLC) and project management methodology, adopting Agile approaches as necessary.
  • Developed common ETL components and wrote Python code to format XML documents, helping to source data from different platforms (see the XML formatting sketch after this list).
  • Generated ad-hoc reports in Excel Power Pivot and shared them via Power BI with decision makers for strategic planning.
  • Utilized Power Query in Power BI to Pivot and Un-pivot the data model for data cleansing and data massaging.
  • Implemented several DAX functions for various fact calculations for efficient data visualization in Power BI.
  • Utilized the Power BI gateway to keep dashboards and reports up to date with on-premises data sources.
  • Managed the system development life cycle, including scoping, requirements gathering, Agile development, quality assurance, production support, and project planning. Created user variables, property expressions, and Script Tasks in SSIS.
  • Implemented various SSIS packages with different tasks and transformations, and scheduled the SSIS packages.
  • Profiled the data using Informatica Data Explorer (IDE) and performed a proof of concept for Informatica Data Quality (IDQ).
  • Used Informatica PowerCenter to load data from different data sources such as XML, flat files, Oracle, Teradata, and Salesforce.
  • Imported the IDQ address-standardization mappings into Informatica Designer as mapplets.
  • Utilized Informatica IDQ to complete initial data profiling and matching/removal of duplicate data.
  • Used relational SQL wherever possible to minimize the data transfer over the network.
  • Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
  • Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Defined and designed a metadata management framework that can be leveraged at the enterprise level to manage metadata for all source systems.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Converted all jobs scheduled in Maestro to the Autosys scheduler per requirements.
  • Maintained master data using Informatica MDM.
  • Wrote UNIX shell scripts for Informatica pre-session and post-session processing, and Autosys scripts for scheduling the jobs (workflows).
  • Performed tuning of queries, targets, sources, mappings, and sessions.
  • Used Linux scripts and the necessary test plans to ensure successful execution of the data loading process.
  • Worked with the Quality Assurance team to build test cases for unit, integration, functional, and performance testing.
  • Provided knowledge transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.
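
The Spark aggregation work above was written in Scala; this is a minimal equivalent sketch in PySpark. The HDFS paths and column names (claims, member_id, paid_amount, claim_id) are hypothetical placeholders:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("claims_aggregation").getOrCreate()

# Read source data from HDFS (path is illustrative).
claims = spark.read.parquet("hdfs:///data/refined/claims")

# Aggregate with the DataFrame API, then write the result back onto HDFS.
totals = (
    claims.groupBy("member_id")
    .agg(F.sum("paid_amount").alias("total_paid"),
         F.count("claim_id").alias("claim_count"))
)
totals.write.mode("overwrite").parquet("hdfs:///data/marts/member_claim_totals")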
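
A minimal sketch of the XML formatting referenced above, using only the Python standard library; the sample payload is illustrative, and ET.indent requires Python 3.9+:

import xml.etree.ElementTree as ET

raw = "<members><member id='1'><name>Jane</name></member></members>"

root = ET.fromstring(raw)
ET.indent(root, space="  ")  # pretty-print the tree in place
formatted = ET.tostring(root, encoding="unicode", xml_declaration=True)
print(formatted)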

Environment: Informatica Power Center 10.1, UNIX, SQL, Informatica B2B Data Transformation 9.6.1, Informatica B2B Data Exchange, Python scripting, IDQ, Informatica Administration, PowerShell scripts, .NET, PeopleSoft, IDE, CDC, MDM, Linux, Perl, WinSCP, Shell, PL/SQL, Netezza, Teradata, Microsoft SQL Server, and Microsoft Visual Studio

Confidential, Lancaster, PA

ETL Informatica Developer

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, and support for production environment.
  • Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using Informatica MDM Hub console.
  • Created and configured new environments; administered and configured Informatica servers.
  • Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
  • Implemented ETL using SQL Server Integration Services (SSIS) and Reporting Services (SSRS).
  • Performed data integrity checks, data cleansing, exploratory analysis, and feature engineering using Python and data visualization packages such as Matplotlib and Seaborn.
  • Used Python to develop a variety of models and algorithms for analytic purposes.
  • Used Python to implement different machine learning algorithms, including Generalized Linear Models, Random Forests, and Gradient Boosting (see the scikit-learn sketch after this list).
  • Performed transformations, cleaning, standardization and filtering of data using Spark Scala/Python and loaded the final required data to HDFS.
  • Utilized Power BI (Power Pivot/View) to design multiple scorecards and dashboards to display information required by different departments and upper level management.
  • Created reports utilizing SSRS, Excel services, Power BI and deployed them on SharePoint Server as per business requirements.
  • Designed complex, data-intensive reports in Power BI utilizing various graph features such as gauges, funnels, and lines for better business analysis.
  • Developed reports in SSRS on SQL Server 2008/2016.
  • Created stored procedures for generating reports using SQL Server Reporting Services (SSRS).
  • Implemented parameterized, cascading parameterized, drill-down, drill-through and sub-reports using SSRS.
  • Loaded data into Spark immutable RDDs and performed in-memory computation to generate quicker, better responses.
  • Developed PowerShell scripts and Batch scripts for file processing and cleansing.
  • Analyzed how data processed by Informatica could be processed effectively using Spark and its APIs.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Created Stored Procedures for data transformation purpose.
  • Involved in the Dimensional Data Modeling and populating the business rules using mappings into the Repository for Data management
  • Oversaw the overall technical solution and implementation, ensuring system development and configurations supported the defined business process.
  • Created and maintained the documentation necessary to proceed with system development (e.g., proposals, test results, implementation documents, price quotes, purchase orders, system release notes).
  • Involved in full SDLC from requirements gathering and data model requirements, development, testing and migration of data and Production support.
  • Created ETL process using SSIS to transfer data from heterogeneous data sources.
  • Created logging for ETL load at package level and task level to log number of records processed by each package and each task in a package using SSIS.
  • Worked with Informatica PowerCenter 9.x tools: Source Analyzer, Data Warehouse Designer, Mapping Designer, Mapplet Designer, and transformations.
  • Extensively used ETL to load data with PowerCenter/PowerExchange from source systems such as flat files and Excel files into staging tables, then loaded the data into the target Oracle database. Analyzed the existing systems and performed a feasibility study.
  • Created Windows PowerShell scripts using cmdlets in the Windows PowerShell environment.
  • Used Informatica PowerCenter and Data Quality transformations (Source Qualifier, Expression, Joiner, Filter, Router, Update Strategy, Union, Sorter, Aggregator, Normalizer, Standardizer, Labeler, Parser, Address Validator (Address Doctor), Match, Merge, and Consolidation) to extract, transform, cleanse, and load data from different sources into DB2, Oracle, Teradata, Netezza, and SQL Server targets.
  • Created and configured workflows, worklets, and sessions to transport the data to target Oracle warehouse tables using Informatica Workflow Manager.
  • Worked on Database migration from Teradata legacy system to Netezza and Hadoop.
  • Worked in building Data Integration and Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Service (SSIS).
  • Used Teradata Utilities (BTEQ, Multi-Load, and Fast-Load) to maintain the database.
  • Built a reusable staging area in Teradata for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ.
  • Created profile and scorecards to review data quality.
  • Actively involved in Data validations and unit testing to make sure data is clean and standardized before loading in to MDM landing tables.
  • Used pipelines in Windows PowerShell to pass the output of one cmdlet to another.
  • Actively involved in the exception handling process in the IDQ Exception transformation after loading data into MDM, notifying the data stewards of all exceptions.
  • Used Autosys as the job scheduler to run applications and their respective workflows on selected recurring schedules.
  • Generated PL/SQL and Shell scripts for scheduling periodic load processes.
  • Designed and developed UNIX scripts for creating, dropping tables which are used for scheduling the jobs.
  • Invoked Informatica using the "pmcmd" utility from UNIX scripts (see the wrapper sketch after this list).
  • Wrote pre-session shell scripts to check session mode (enabled/disabled) before running or scheduling batches.
  • Supported a 24x7 on-call rotation; strong grasp of the Tivoli scheduling tool.
  • Involved in Production support activities like batch monitoring process in UNIX.
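
A minimal scikit-learn sketch of the model families mentioned above, with logistic regression standing in for the GLM; the dataset is synthetic and for illustration only:

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=42)

models = {
    "glm": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "gradient_boosting": GradientBoostingClassifier(random_state=42),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)  # 5-fold cross-validation
    print(f"{name}: mean accuracy {scores.mean():.3f}")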
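
A minimal sketch of invoking pmcmd from a script, mirroring the UNIX wrapper pattern above but expressed in Python. The service, domain, credential, folder, and workflow names are placeholders, and exact pmcmd options should be verified against the installed PowerCenter version:

import subprocess
import sys

def start_workflow(folder: str, workflow: str) -> int:
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC",      # integration service name (placeholder)
        "-d", "DOMAIN_DEV",    # domain name (placeholder)
        "-u", "etl_user",      # credentials would normally come from a
        "-p", "********",      # secured parameter file, not from code
        "-f", folder,
        "-wait",               # block until the workflow completes
        workflow,
    ]
    result = subprocess.run(cmd)
    return result.returncode   # non-zero means the workflow failed

if __name__ == "__main__":
    sys.exit(start_workflow("FOLDER_SALES", "wf_load_sales"))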

Environment: Informatica Power Center 9.6.1, Informatica B2B Data Transformation 9.5.1, PowerExchange, Informatica Administration, Python scripting, UNIX, Oracle, Linux, Perl, Shell, PowerShell, MDM, IDQ, PL/SQL, Tivoli, Oracle 11g/10g, Teradata 14.0.

Confidential, Dayton, OH

ETL Informatica Developer

Responsibilities:

  • Strong HIPAA EDI 4010 and 5010 experience with ICD-9 and ICD-10, including analysis and compliance from healthcare payer, provider, and exchange perspectives, with a primary focus on coordination of benefits.
  • Involved in all phases of SDLC (Software Development Life Cycle) including Requirement collection, Design and analysis of Customer specification, Development and Customization of the application.
  • Led global teams; provided hands-on participation, technical guidance, and leadership for the data separation effort.
  • Re-architected DataStage jobs to maintain logical separation as the de-merger progressed.
  • Resolved data issues in Cognos reports post data separation.
  • Extensive experience with Data Profiling.
  • Developed PowerShell scripting to automate file transfers (see the transfer sketch after this list).
  • Defined Target Load Order Plan for loading Target when control table logic is used.
  • Configured the sessions using Workflow manager to have Multiple Partitions on Source data and to improve performance.
  • Worked on DataStage upgrade from version 8.1 to version 9.1.
  • Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming and loading data into data warehouse.
  • Used DataStage stages namely Datasets, Sort, Lookup, Peek, Standardization, Row Generator stages, Remove Duplicates, Filter, External Filter, Aggregator, Funnel, Modify, and Column Export in accomplishing the ETL coding.
  • Managed migration of Data Center and DataStage Applications from SunGard to KDC domain.
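
The transfer automation above was written in PowerShell; an equivalent minimal sketch using Python's standard library looks like the following, with host, credentials, and paths as placeholders:

import ftplib
from pathlib import Path

def push_files(host: str, user: str, password: str, local_dir: str) -> None:
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd("/inbound")                          # remote landing directory
        for path in sorted(Path(local_dir).glob("*.csv")):
            with path.open("rb") as fh:
                ftp.storbinary(f"STOR {path.name}", fh)
            path.rename(path.with_suffix(".sent"))   # mark file as delivered

# push_files("ftp.example.com", "etl_user", "********", "/data/outbound")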

Environment: Informatica Power Center 9.5.1/8.1, IBM InfoSphere QualityStage, SQL Server 2008, Oracle, DB2, MicroStrategy, Toad, UNIX, Informatica Versioning Tool, Tableau Desktop 7 & 8.1, Tableau 8.2/8.3/9.0, Tableau Server, Erwin 9.5, Spark, DataStage, Autosys, PowerShell, EDI 834/837, Informatica Scheduler.

Confidential

ETL/Informatica Developer

Responsibilities:

  • Involved in developing Informatica mappings and tuned them for better performance.
  • Administered the Informatica repository by creating and managing user profiles and metadata
  • Created Informatica mappings with stored procedures to build business rules to load data.
  • Used various transformations (Source Qualifier, Aggregator, connected and unconnected Lookups, Filter, and Sequence) to handle situations depending upon the requirement.
  • Called stored procedures to perform database operations in post-session and pre-session commands.
  • Wrote parameter files for batch processing from different repositories.
  • Created partitions to load the data concurrently.
  • Loaded rejected data using the reject loader utility.
  • Involved in writing shell scripts and automating batch jobs using crontab (see the cron wrapper sketch after this list).
  • Performed Unit Testing and tuned for better performance.
  • Wrote UNIX shell scripts for moving data from all source systems to the data warehousing system.
  • Built new dimensions in Universes to support the new reporting requirements of business users.
  • Used SQL tools like TOAD to run SQL queries and validate the data pulled in BO reports.
  • Created reports using Business Objects functionality such as combined queries, slice and dice, drill down, functions, cross tab, master detail, and formulas.
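
A minimal sketch of a batch wrapper driven by crontab, matching the automation pattern above; the crontab entry, paths, and the stand-in command are illustrative:

# Example crontab entry (runs nightly at 2 AM):
#   0 2 * * * /usr/bin/python3 /opt/etl/run_batch.py >> /var/log/etl/batch.log 2>&1
import logging
import subprocess
import sys

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

def main() -> int:
    logging.info("nightly batch starting")
    # The real job would invoke a warehouse load script; 'true' is a stand-in.
    result = subprocess.run(["true"])
    logging.info("nightly batch finished rc=%d", result.returncode)
    return result.returncode

if __name__ == "__main__":
    sys.exit(main())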

Environment: Informatica PowerCenter 5.1, Business Objects 5.1, Oracle 8i, PL/SQL, SQL 2000, Windows NT, Sun Solaris 7.0, DB2
