Sr. Informatica Developer Resume
Des Moines, IA
SUMMARY
- 7+ years of experience in Information Technology, including Data Warehouse/Data Engineering/Data Mart/Data Integration/Data Pipeline development using ETL/Informatica Power Center.
- Extensive experience in Data Engineering/Data Migration/Data Auditing/Data Modeling in Finance, Educational Management, Insurance, and Healthcare domains.
- Good exposure to the overall SDLC, including requirement gathering, development, testing, debugging, deployment, documentation, and production support.
- Power BI developer with 1.5+ years of experience creating reports and dashboards using advanced DAX.
- Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence (BI) solutions using Data Warehouse/Data Mart design, ETL, OLAP, and client/server applications.
- Strong Data Warehousing ETL experience using Informatica Power Center 10.x/9.6.1/9.5.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
- Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
- Extensive ETL testing experience using Informatica 10.2/9.6.1/9.5.1 (Power Center/Power Mart) tools (Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.
- Practical understanding of data modeling (dimensional and relational) concepts such as star schema modeling, snowflake schema modeling, identifying facts and dimensions, and physical and logical data modeling using Erwin and ER-Studio.
- Expertise in working with relational databases such as Oracle 18c/12c/11g, SQL Server 2008/2005, DB2 8.0/7.0, UDB, MS Access, and Teradata.
- Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Power Exchange, and Power Connect as ETL tools on Oracle, DB2, and SQL Server databases.
- Experience in Azure big data technologies such as Azure Data Lake and HDInsight.
- Designed and developed data extraction and processing procedures that capture the information needed to build analytic systems using HDInsight and Spark on the Azure data platform.
- Provided guidance on database design for migration from on-premises to Azure.
- Extracted data from source systems using ingestion tools such as Azure Data Factory and Informatica.
- Experience using the SAS ETL tool, the Talend ETL tool, and SAS Enterprise Data Integration Server.
- Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
- Experience in resolving on-going maintenance issues and bug fixes, monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Extensive experience in writing UNIX shell scripts and using them to automate ETL processes.
- Proficient in integrating various data sources with multiple relational databases such as Oracle 18c/12c/11g, MS SQL Server, DB2, Teradata, VSAM files, and flat files into the staging area, ODS, data warehouse, and data mart.
- Experience in using Automation Scheduling tools like Autosys, Control-M and Tidal.
- Implemented Slowly Changing Dimensions (SCD) Type 1 and Type 2 to update the dimensional schema.
- Raised change requests and incidents using the JIRA ticketing system; analyzed and coordinated the resolution of program flaws in the development environment and hot-fixed them in the QA, pre-prod, and prod environments during runs.
- Used JIRA for issue tracking, user story management, and requirements traceability, as well as for defining the scope of development work by prioritizing based on deadlines and level of effort.
- Excellent interpersonal and communication skills; experienced in working with senior-level managers, business people, and developers across multiple disciplines.
TECHNICAL SKILLS
Tools: AWS cloud services such as S3, EC2, ELB, Redshift, Lambda; Azure Bot, Azure Databricks, Azure Blockchain; Informatica, MS SQL Server, Teradata SQL Assistant, Oracle SQL Developer, Toad, OBIEE, Tableau, Control-M
Schemas: Star Schema, Snowflake Schema.
Other Areas of Expertise: RDS, Data Migration Services, DynamoDB, Glue, SQS, SNS, Athena, EMR, Big Data Analytics, Data Modeling, Data Mapping, Data Warehouse Development, Data Analysis, OBIEE Reporting, IDQ (9.6, 9.1), Data Standardization, Error and Audit Controls, Matching Logic Concepts, DB-Level Performance Tuning Concepts.
Databases/RDBMS: MS SQL Server (T-SQL), Oracle, Teradata, DB2
Scripting: UNIX shell scripts, PowerShell, JavaScript, Perl, Python
Automation/Scheduling: Java, Tidal.
Operating Systems: Windows, Android, Linux, macOS. Databases: SQL Server, Oracle, MySQL
Ticketing: Jira.
PROFESSIONAL EXPERIENCE
Confidential, Des Moines, IA
Sr. Informatica Developer
Responsibilities:
- Analyzed the business's database storage and warehousing capabilities and assessed the company's data requirements.
- Worked across multiple program areas including data analytics, AWS cloud services, ETL, BI reporting, data migration and conversion, data modeling, and data visualization.
- Used AWS cloud services such as S3, Redshift, RDS, Database Migration Service, DynamoDB, Lambda, Glue, SQS, SNS, Athena, ELB, EC2, and EMR for data integration.
- Exposure to Python and Big Data Analytics.
- Working knowledge of BI reporting tools such as Tableau and OBIEE.
- As a Sr. ETL Developer/Data Engineer on the current project, applied ETL methodologies and BI reporting capabilities to the data warehouse for the business/sales team.
- Strong background in data warehousing, business intelligence, and ETL processes.
- Worked on Data Integration and Data conversion projects.
- Migrated 300+ ETL jobs from a legacy scheduler to the Control-M platform.
- Experience in data processing: collecting, aggregating, and moving data from various sources using Apache Flume and Kafka.
- Worked on a POC for extracting real-time data using Kafka and Spark Streaming: created DStreams, converted them into RDDs, processed the data, and stored it in Cassandra (see the sketch at the end of this section).
- Responsible for the end-to-end data flow across both technical and functional aspects of the projects (including change controls and performance improvement); further, identified potential issues and helped teams overcome roadblocks.
- Involved in designing mapping documents and Functional code-level changes.
- Designed and developed complex ETL Logic and Business OBIEE Reports.
- Complete understanding of regular matching, fuzzy logic, and dedupe limitations in the IDQ suite.
- Heavily involved in the root cause analysis for Production issues and Code fix. Exposure to IDQ.
- Gathered requirements from the business and coordinated with the Data Architect; in addition, performed data cleansing activities using various IDQ transformations.
- Extensive experience in SDLC using Agile methodology.
- Expertise in Estimating and planning of development work using Agile Software Development.
- Awarded by the account for Distinctive Achievement during the final evaluations completed by the Talent Lead.
- Gathered requirements from data architects, client teams, and business users, and developed the mapping documents.
- Hands-on technical work including code development, testing, and performance improvement for the end-to-end data flow.
- Worked with Teradata, SQL Server, and Oracle as source databases, along with flat files.
- Worked on data warehouse development: developed new ETL applications and maintained and improved the performance of existing applications in production.
- Performed end-to-end data analysis of the current business logic and suggested code-level changes to make the data/reports match business expectations; attended bi-weekly meetings with the architect and business team on the same.
- Imported data from SQL Server and Azure SQL databases into Power BI to generate reports; additionally, created DAX queries to generate computed columns in Power BI.
- Published Power BI reports to the required organizations and made Power BI dashboards available in web clients and mobile apps.
- Developed a Power BI model used for financial reporting of P&L and headcount.
- Implemented Error Handling and Audit Steps.
- Worked on performance-improvement methodologies including table partitioning at the DB level; Informatica session-level parallel partitioning using approaches such as hash, round-robin, and pass-through partitioning; applying indexes at the table level; and using parallel hints to optimize query run time (see the tuning sketch at the end of this section).
Skills: IDQ (9.6,9.1), AWS, S3, Redshift, RDS, Data Migration Services, DynamoDB, Lambda, Glue, SQS, SNS, Athena, ELB, EC2, EMR, Power BI.
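A minimal sketch of the streaming POC described above, assuming the legacy PySpark DStream API (the spark-streaming-kafka-0-8 package) and the DataStax Cassandra driver; the topic, hosts, keyspace, table, and column names are all hypothetical.

```python
from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils   # spark-streaming-kafka-0-8 package
from cassandra.cluster import Cluster            # DataStax cassandra-driver

sc = SparkContext(appName="kafka-cassandra-poc")
ssc = StreamingContext(sc, batchDuration=10)     # 10-second micro-batches

# DStream over a hypothetical 'orders' topic; the broker host is an assumption.
stream = KafkaUtils.createDirectStream(
    ssc, ["orders"], {"metadata.broker.list": "kafka-broker:9092"})

def save_partition(rows):
    # One Cassandra session per partition; keyspace/table names are hypothetical.
    session = Cluster(["cassandra-host"]).connect("sales_ks")
    for order_id, amount in rows:
        session.execute(
            "INSERT INTO orders (order_id, amount) VALUES (%s, %s)",
            (order_id, float(amount)))

def process(rdd):
    # Each Kafka record arrives as (key, value); parse 'id,amount' CSV values.
    parsed = rdd.map(lambda kv: kv[1].split(",")).map(lambda f: (f[0], f[1]))
    parsed.foreachPartition(save_partition)

stream.foreachRDD(process)   # each micro-batch surfaces as an RDD
ssc.start()
ssc.awaitTermination()
```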
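A hedged illustration of the DB-level tuning mentioned above: an Oracle hash-partitioned table plus a parallel hint, run through any DB-API connection. The table and column names are hypothetical.

```python
# DDL for a hash-partitioned fact table (hypothetical names).
CREATE_SALES_FACT = """
CREATE TABLE sales_fact (
    sale_id   NUMBER,
    region_id NUMBER,
    amount    NUMBER
)
PARTITION BY HASH (region_id) PARTITIONS 8
"""

# Parallel hint asking the optimizer for a degree-4 parallel scan.
HINTED_SUMMARY = """
SELECT /*+ PARALLEL(sales_fact, 4) */ region_id, SUM(amount)
  FROM sales_fact
 GROUP BY region_id
"""

def run_tuning_demo(conn):
    cur = conn.cursor()
    cur.execute(CREATE_SALES_FACT)
    cur.execute(HINTED_SUMMARY)
    return cur.fetchall()
```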
Confidential
Sr. Informatica Developer
Responsibilities:
- Created Data Maps / Extraction groups in Power Exchange Navigator for the Legacy IMS Parent sources.
- Staged data from the legacy Proclaim IMS system into Oracle 12c master tables and performed CDC capture registrations.
- Assisted in building ETL source-to-target specification documents by understanding the business requirements.
- 1+ year of experience in Master Data Management (MDM) Multidomain Edition Hub Console configurations and Informatica Data Director (IDD) application creation.
- Extensive experience in Master Data Management (MDM) Hub Console configurations such as staging process configurations, landing process configurations, match and merge processes, cleansing functions, and user exits.
- Implemented MDM configurations for Party, Vendor, Product and Customer.
- Used Informatica MDM 10.1 (Siperian) tool to manage Master data of EDW.
- Extracted consolidated golden records from MDM base objects and loaded them into downstream applications.
- Developed mappings that extract, transform, and load source data into the Derived Masters schema, using various Power Center transformations (Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer, and Update Strategy) to meet the business logic.
- Created mapplets, common functions, reusable transformations, look-ups for better usability.
- Performed Performance tuning at the Mapping level as well as the Database level to increase the data throughput.
- Designed the process control table that maintains the status of all CDC jobs and thereby drives the load of the Derived Master tables.
- Used Teradata utilities such as BTEQ, FastLoad, FastExport, and MultiLoad for data conversion.
- Created post-session UNIX scripts to perform operations such as gunzip, file removal, and touch (a Python sketch of equivalent post-processing follows this section).
Skills: IBM Information Management System (IMS), Siperian (10.0, 9.0.1), PowerCenter, MDM, CDC, Teradata BTEQ, FastLoad, FastExport, MultiLoad, UNIX scripts, gunzip.
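A small Python equivalent of the post-session UNIX clean-up steps described above (gunzip, remove, touch); the directory, file pattern, and marker name are hypothetical.

```python
import gzip
import os
import shutil
from pathlib import Path

STAGE_DIR = Path("/data/stage")            # assumed landing directory

def post_process(done_marker: str = "load.done") -> None:
    for gz_file in STAGE_DIR.glob("*.gz"):
        target = gz_file.with_suffix("")   # file.csv.gz -> file.csv
        # "gunzip": decompress, then remove the compressed original
        with gzip.open(gz_file, "rb") as src, open(target, "wb") as dst:
            shutil.copyfileobj(src, dst)
        os.remove(gz_file)
    # "touch": create or refresh a marker file for the next job in the chain
    (STAGE_DIR / done_marker).touch()

if __name__ == "__main__":
    post_process()
```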
Confidential
Sr. Informatica Developer
Responsibilities:
- Used Informatica Power Center 9.6.1 to extract data from an AS400 insurance application, transform it, and load it into the enterprise data warehouse and data mart.
- Held direct conversations with business users to gather requirements on how to handle the source data, and designed the ETL process accordingly.
- Involved in Informatica administration, including creating new users and groups, backing up the repository and domain, and handling various upgrades.
- Developed logical and physical data models, with experience in forward and reverse engineering using Erwin.
- Used Informatica B2B Data Exchange to handle EDI (Electronic Data Interchange) for payments at the scheduled dates.
- Used major components such as parsers, mappers, and streamers in Data Transformation Studio to convert XML files to other formats.
- Extensively worked with XML files as source and target, used transformations such as XML Generator and XML Parser to transform XML files, and used the Oracle XMLTYPE data type to store XML files (see the parsing sketch at the end of this section).
- Implemented various data modeling schemas such as star schema and snowflake schema.
- Used Informatica features to implement Type II changes in slowly changing dimension tables (an SCD Type 2 sketch follows this section).
- Involved in developing Informatica mappings and tuned sessions by increasing block size for better performance.
- Used different MDM concepts for data transformation, normalization, error detection, and error correction.
- Implemented various data cleanup and data validation processes using Informatica Data Quality.
Skills: Informatica Power Center 9.x/8.x, Informatica Power Exchange 9.x/8.x, IDQ 9.6/9.1. BI Tools: OBIEE (10g, 11g), Tableau. Scheduling Tools: Control-M, Autosys. Languages: C, Java, C++, Python, R. RDBMS: Oracle/Exadata 11g/10g, DB2 V9, Teradata 14/13, PL/SQL, SQL Server 2008. Tools/Utilities: Toad, Perl, UNIX Shell, Business Objects XI R1. Operating Systems: Windows, UNIX, Linux. Web Technologies: HTML, CSS, XML, JavaScript.
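An illustrative Python equivalent of the XML-to-flat-file conversion described above; inside Informatica this is done with the XML Parser transformation, and the element names (payment, payee, amount, due_date) here are hypothetical.

```python
import csv
import xml.etree.ElementTree as ET

def xml_to_csv(xml_path: str, csv_path: str) -> None:
    # Flatten repeating <payment> elements into CSV rows.
    root = ET.parse(xml_path).getroot()
    with open(csv_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["payee", "amount", "due_date"])
        for pay in root.iter("payment"):
            writer.writerow([
                pay.findtext("payee"),
                pay.findtext("amount"),
                pay.findtext("due_date"),
            ])
```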
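A hedged sketch of the SCD Type II pattern referenced above: expire the current dimension row, then insert a new version. Informatica implements this with Lookup and Update Strategy transformations; here it is shown as plain SQL driven from Python, assuming an Oracle-style DB-API connection (e.g., cx_Oracle) and hypothetical table, sequence, and column names.

```python
EXPIRE_SQL = """
    UPDATE customer_dim
       SET eff_end_date = SYSDATE, current_flag = 'N'
     WHERE customer_id = :1 AND current_flag = 'Y'
"""
INSERT_SQL = """
    INSERT INTO customer_dim
        (customer_key, customer_id, address,
         eff_start_date, eff_end_date, current_flag)
    VALUES (customer_seq.NEXTVAL, :1, :2, SYSDATE, NULL, 'Y')
"""

def scd2_upsert(conn, customer_id, new_address):
    cur = conn.cursor()
    cur.execute(EXPIRE_SQL, (customer_id,))              # close out old version
    cur.execute(INSERT_SQL, (customer_id, new_address))  # open new version
    conn.commit()
```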
Confidential
Informatica Developer
Responsibilities:
- Designed and developed complex Informatica mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure, and other transformations to implement complex logic in mappings.
- Built reusable Informatica transformations and mapplets wherever logic would otherwise be redundant.
- Involved in creating mapplets using Java transformations to decrypt encrypted data.
- Worked extensively with transformations such as Aggregator to produce monthly, quarterly, and yearly invoicing; also used SQL overrides to filter the data.
- Analyzed source data systems using the basic IDQ profiling features in the Informatica Developer and Analyst tools before making any ETL designs.
- Created financial accounting Rewards and Rebates dimension table mappings.
- Involved in testing, debugging, validation, and performance tuning of the data warehouse; helped develop optimum solutions for data warehouse deliverables.
- Used the Debugger to test the logic implemented in the mappings.
- Developed mappings to load data into slowly changing dimensions (SCD).
- Built the ETL source-to-target specification documents by understanding the business requirements. Automated production jobs using UNIX shell scripting and implemented automated ETL solutions, including scheduling jobs with Informatica infacmd commands, to remove manual dependencies (a hedged wrapper sketch follows this section).
- Identified areas of improvement and developed UNIX automation scripts that reduced manual effort by 40%, from 10 hrs to 6 hrs.
- Involved in Data Quality Analysis to determine the cleansing requirements.
- Created Profiles and Scorecards and performed join analysis in IDQ.
- Experience in Informatica Data Quality transformations like Address Doctor, Parser, Labeler, Match, Exception, Merger and Standardizer.
- Created Oracle Stored Procedure to implement complex business logic for better performance.
- Tuned the performance of Informatica sessions for large data files by increasing block size, data cache size, and the target-based commit interval.
- Also tuned mappings by tracking the reader, writer, and transformation threads in the session logs; set the tracing level to verbose only during development and only with very small data sets.
- Involved in unit testing and system testing to verify that data loaded into targets was accurate.
- Worked on troubleshooting issues and quickly resolving them in an efficient manner that minimizes downtime.
- Worked with flat files in both direct and indirect methods.
- Worked on MicroStrategy reporting tool to generate reports for business from DWH reporting tables.
- Expert use of the Informatica scheduler along with Tidal for scheduling workflows; further, involved in creating jobs in the Tidal scheduler tool.
Environment: Informatica Power Center 9.5.1, IDQ, Oracle, DB2, Netezza, Control-M, Java, Tidal Scheduler, MicroStrategy, and UNIX.
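A hedged sketch of how a Python wrapper might start a PowerCenter workflow from a script, in the spirit of the scheduling automation above. It invokes pmcmd's startworkflow command; the service, domain, folder, and workflow names are hypothetical, and exact flags vary by Informatica version and environment.

```python
import subprocess
import sys

def run_workflow(workflow: str, folder: str = "SALES_DW") -> None:
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "INT_SVC",      # integration service (assumed name)
        "-d", "DOMAIN_DEV",    # domain (assumed name)
        "-u", "etl_user", "-p", "********",
        "-f", folder,
        "-wait",               # block until the workflow finishes
        workflow,
    ]
    result = subprocess.run(cmd)
    if result.returncode != 0:  # non-zero exit means the workflow failed
        sys.exit(result.returncode)

if __name__ == "__main__":
    run_workflow("wf_LOAD_INVOICES")
```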
Confidential
Informatica Developer
Responsibilities:
- Developed mappings and mapplets using Informatica Designer to load data into ODS from various transactional source systems.
- Used Informatica Designer to import the sources, targets, create various transformations and mappings for extracting, transforming and loading operational data into the EDW from ODS.
- Used various transformations such as expression, filter, rank, source qualifier, joiner, aggregator and Normalizer in the mappings and applied surrogate keys on target table.
- Created mapplets and reusable transformations.
- Created workflows, worklets, and tasks to schedule the loads at the required frequency using Workflow Manager.
- Expert use of tasks such as Email tasks to deliver generated reports to mailboxes and Command tasks to write post-session and pre-session commands.
- Created stored procedures, functions, and packages in Oracle using PL/SQL and SQL. Used SQL*Loader to load data from MS Excel sheets.
- Created connection pools, physical tables, defined joins and implemented authorizations in the physical layer of the repository.
- Migrated mappings from Development to Testing and performed Unit Testing and Integration Testing.
Environment: Informatica Power Center 8.6/9.5.1, Repository Manager, SQL*loader, Oracle 10g, PL/SQL, SQL, UNIX, Win 2000/NT.