Sr. ETL Engineer Resume
SUMMARY
- Over the past 10 years, Hari has excelled as an IT specialist with expertise in the software development life cycle (SDLC), Agile methodologies, and ETL-related activities, using a wide variety of information, knowledge, and tools to develop, modify, and administer databases used to store and retrieve data; ensuring the security of data; developing contingency plans to mitigate and minimize risks associated with IT system vulnerabilities; and developing standards for the handling of data.
- Hari has strong management skills and leadership experience, with a reputation for meeting challenging organizational goals and objectives, and he has led efforts to organize, analyze, and provide strategies on highly complex problems.
- Hari’s professional goal is to contribute and continue to lead at the highest level, making a positive impact through product development that leverages ETL technology and creative problem solving. He is an intellectually curious individual with a solutions-oriented attitude who enjoys learning new tools and techniques.
- With experience in diverse workplace environments, he is able to manage teams and programs to achieve goals, objectives, strategies, and solutions.
- Displays high standards of ethical conduct and possesses excellent interpersonal skills in team management, along with a commitment to quality, reliability, and teamwork.
- Managed resources and developed work plans to accomplish the overall goals of the project and mission.
- Coordinated with teams across departments to achieve the best outcomes.
- Managed projects by planning, scheduling, tracking, reviewing, and reporting status.
- Experience in identifying the business needs of clients and stakeholders to help determine solutions to business problems.
- Identifying and analyzing IT software and system issues.
- Suggesting appropriate technology-based solutions for enhancing functional efficiency of the organization and achieving business excellence.
- Experience in analyzing, interpreting, explaining and providing technical guidance regarding problem resolutions.
- Worked with management, customers, and support staff across domains to resolve technical issues and identify and diagnose problems.
- Demonstrated attention to customers’ needs by conducting and participating in meetings to ensure successful deliverables.
- Communicated in a professional, accurate, organized, clear, and concise manner, adjusted as appropriate to the participants: customers, peers, and management.
- Conveyed messages to listeners in a manner that is easily understood while being courteous and polite, remaining open to discussion and receptive and supportive of new ideas and suggestions.
- Produced written information that is appropriate for the intended audience by following procedures and guidelines.
PROFESSIONAL EXPERIENCE
Confidential
Sr. ETL Engineer
Responsibilities:
- Actively involved in the successful design, development, and delivery of the HR modernization effort of Confidential (TSA) under the Recruit and Hire (R&H) Candidate Cloud Project in Microsoft Azure. The project provides a solution that automates many manual tasks and reduces the number of disparate systems, with all data flowing into and out of the Azure data platform, which stores both transactional and analytical data.
- Engaged with the business analysts, the client team, and the development/integration team for system integration between Salesforce, the Azure Government Cloud, external systems such as Monster and USAJOBS, and the TSA on-premises data center to ensure overall system functionality through the Agile process.
- Involved in multidimensional modeling, performed both forward and reverse engineering, and, using Erwin, designed and implemented logical and physical data models for the Vacancy, Candidate, Application, Assessment, Job Offer, Certificate, Organization, and SF52 (Request for Personnel Action/reason for resignation/retirement) datasets.
- Using T-SQL and SQL: Designed, coded, and implemented tables, constraints, indexes, table types, external tables, external data sources, views, stored procedures, functions, triggers, cursors, complex joins and subqueries, CTEs, temp tables, and table variables for the Candidate Cloud Project in Microsoft Azure.
- Using T-SQL and SQL: Designed, coded, and implemented stored procedures, functions, and views that use JSON format to pull data from Salesforce into the Azure cloud database for the Candidate Cloud Project in Microsoft Azure.
- Designed master and child Logic Apps and integrated them to pull/push data in JSON format between Salesforce and DC1 for different datasets.
- Designed stored procedures that accept JSON messages for the Candidate and Location datasets and push these datasets to Salesforce; furthermore, created Azure Logic Apps to call the stored procedures and push the results to Salesforce (see the OPENJSON sketch after this list).
- Using Azure Resource Manager (ARM): Deployed Azure Data Factory Pipelines from Dev to Test and Prod.
- Designed and implemented stored procedures that accept a JSON message and load it into different tables.
- Developed extract, transform, and load (ETL) processes in Azure SQL Data Warehouse using SSIS, Azure Data Factory, Azure SQL Server, Azure SQL Database, Azure Logic Apps, Azure Storage, Azure Data Lake Store and Analytics, Event Grid, Event Hubs, Azure Stream Analytics jobs, AzCopy, and others.
- Designed and implemented SSIS packages for data migration from legacy on-premises source systems such as MS Access and Oracle to Azure SQL Databases (data profiling, data cleaning, data transformation, data mapping, and data migration) for the Candidates, Assessments, Applications, FAMS, and Certificate datasets. This included schema migration, data migration, and data validation for the Human Resource Capital migration.
- Processed document and delta dump files using Oracle Data Pump utilities, moving tables and schemas first to an Oracle database and then to Azure SQL Databases and Azure Storage through the ADF pipeline; moreover, created PL/SQL stored procedures, functions, and views to automate the delta load process for those files.
- Designed and implemented Azure Data Factory pipelines that pull data from Salesforce into Azure SQL Databases for the HR Application, Candidate, Assessment, Accounts, JOA, and Job Offers datasets.
- Designed and implemented an Azure Data Factory pipeline that pushes data from Azure SQL Databases to Salesforce for the HR Application, Candidate, Assessment, and Accounts objects.
- Using Azure Data Factory, designed a pipeline that transfers files from Azure Blob Storage to the DC1 SFTP server and automated the process.
- Designed and implemented multiple stored procedures that extract the SF52, Assessments, and Accounts datasets in Azure SQL Databases and export them to DC1 systems in JSON format.
- Designed and implemented a stored procedure that calculates a complex composite score for vet and non-vet candidates; furthermore, designed an ADF pipeline pushing the composite-score datasets from the staging to the CC databases and then to the Salesforce system (a simplified sketch follows this list).
- Participated in code/schema reviews throughout the software development process, tracked versions of the source code, and moved ETL code between environments.
- Participated in developing data dictionaries, data models, metadata repositories, and other data management tools to ensure compliance with data management standards.
- Gathered business and technical requirements for the data warehouse and translated user inputs into ETL design documents by preparing source-to-target (S2T) documents and other technical specification documents for each enhancement and maintenance release.
- Participated in the entire software development lifecycle by writing and executing test plans, finding solutions to issues during development and after deployment.
- Administered Windows/Azure instances: Installed and configured Active Directory Domain Services for object management (users, groups, network printers, network guest nodes) and Active Directory Lightweight Directory Services for application deployment.
- Installed, configured, upgraded, and managed tasks related to system and server administration, including software applications and user authentication, using VMware vSphere hypervisor 5.x/6.x technologies.
- Established connectivity from Azure to the on-premises data center using Azure ExpressRoute for single- and multi-subscription connectivity.
- Troubleshot and solved production problems relating to Azure Active Directory, Active Directory Federation Services (ADFS), Microsoft software and services such as SQL Server, System Center Operations Manager (SCOM), and IIS, and cloud computing service models such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS).
- Converted multiple onsite data centers to virtual infrastructure using Microsoft Hyper-V, reducing the physical server footprint while improving management, efficiency, and performance.
- Installed and upgraded VMware Tools for virtual machines; created clusters for High Availability (HA) and the Distributed Resource Scheduler (DRS); and created Azure VMs, Cloud Services, and storage accounts using Azure PowerShell scripts.
- Project Accomplishments: Implemented the final TSA-approved application for the new Candidate Portal for Recruitment and Hiring by successfully migrating Leidos legacy data to Azure and integrating the Azure cloud database for the Candidate Cloud Project with Salesforce.
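A minimal, hedged sketch of the JSON-accepting stored procedures described above: it shreds a Logic App payload into a staging table with T-SQL OPENJSON (SQL Server 2016+/Azure SQL). The procedure, table, and column names (stg.Candidate, @CandidateJson) are hypothetical, not taken from the project.

    -- Hypothetical names throughout: accept a JSON message posted by a
    -- Logic App and shred it into a staging table with OPENJSON.
    CREATE PROCEDURE stg.usp_LoadCandidateFromJson
        @CandidateJson NVARCHAR(MAX)  -- JSON payload from the Logic App
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO stg.Candidate (CandidateId, FirstName, LastName, Email)
        SELECT CandidateId, FirstName, LastName, Email
        FROM OPENJSON(@CandidateJson)
             WITH (
                 CandidateId NVARCHAR(18)  '$.Id',
                 FirstName   NVARCHAR(100) '$.FirstName',
                 LastName    NVARCHAR(100) '$.LastName',
                 Email       NVARCHAR(255) '$.Email'
             );
    END;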
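Likewise, a simplified sketch of the composite-score procedure, assuming a hypothetical assessment schema and illustrative weights; the actual scoring rules were more involved.

    -- Hypothetical schema and weights: blend assessment components into a
    -- composite score, adding veteran's-preference points for vet
    -- candidates, and stage the results for the ADF push to Salesforce.
    CREATE PROCEDURE stg.usp_ComputeCompositeScore
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO stg.CompositeScore (CandidateId, CompositeScore)
        SELECT a.CandidateId,
               (0.6 * a.WrittenScore + 0.4 * a.InterviewScore)  -- illustrative weights
               + CASE WHEN c.IsVeteran = 1 THEN 5.0 ELSE 0.0 END
        FROM stg.Assessment a
        JOIN stg.Candidate  c ON c.CandidateId = a.CandidateId;
    END;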
Project Environment: SSIS, Azure, SQL Server, T-SQL, Oracle 11g, PL/SQL, Salesforce, SharePoint, UNIX, WinSCP, MS Visio, Erwin, JIRA, TFS, .NET, JSON, Power BI.
Confidential
Sr. ETL Developer
Responsibilities:
- Worked on a massively parallel processing (MPP) operational data store at large scale, using the big-data Greenplum database to house key data needed for scoring entities based on sophisticated business rules and tax-fraud analytic models.
- Participated in reviewing the Return Review Program (RRP) business and functional requirements, from the ETL perspective, to help identify tax return fraud instances and potential improvements in the IRS approach to preventing, detecting, and resolving pre-refund tax fraud.
- Developed extract, transform, and load (ETL) code to load large amounts of data from other IRS internal data sources into the RRP system while fulfilling stringent performance requirements.
- Using Informatica PowerCenter 10.2 and the Developer tool (10.2) as an Integration Platform as a Service (iPaaS): Created ETL mappings, mapplets, workflows, and worklets in both production and development environments and prepared corresponding documentation; also involved in the migration/conversion of ETL processes from development to production.
- Administered the repository by creating folders and logins for the group members and assigning necessary privileges.
- Designed and developed ETL processes, translating ETL logic to process a large number of files (mostly CSV, XML, and COBOL) from multiple source systems containing massive data volumes, loading them into staging tables and then into the target database, and modifying the XSD schemas based on changing requirements when necessary.
- Created complex mappings by defining transformation logic according to the business rules, using Connected and Unconnected Lookup, Aggregator, Update Strategy, Stored Procedure, Java, Normalizer, and Router transformations to efficiently populate target tables.
- Involved in production support: observed the Workflow Monitor, watched closely for completion messages and status reports (hourly/daily) to verify that all workflows and tasks were running correctly and that data was loading properly to the targets at the right time, and analyzed and resolved, or submitted incident reports for, all data-related production issues.
- Involved in debugging and performance tuning of ETL bottlenecks at various levels (mappings, sessions, workflows, transformations, targets, sources, and databases) and recommended performance enhancements such as indexing, partitioning, increasing storage capacity, or modifying interfaces.
- Using SQL and PL/pgSQL (PostgreSQL) in the Greenplum database: Developed complex database objects such as stored procedures, functions, packages, and triggers.
- Extensively worked on views, stored procedures, triggers, and SQL queries, and loaded data into staging to enhance and maintain existing functionality.
- Performed a wide range of database administration functions, including running test queries, troubleshooting database problems, maintaining version control of database entities, advising customers on new database features, and leading studies to evaluate the effectiveness of current database methods and procedures.
- Using UNIX shell scripting: Created and maintained parameter files, executed database procedures, called Informatica workflows, and performed job monitoring and scheduling.
- Performed reverse engineering of Informatica code, UNIX shell scripts, PostgreSQL/Greenplum SQL scripts, stored procedures, and metadata to understand the existing processes, data, and relationships.
- Using the Control-M workload automation tool: Monitored hourly jobs in production environments, troubleshooting failures or raising alerts when jobs failed.
- Created unit test cases to test data loads and checked whether the components adhered to the technical design and fixed or reported defects.
- Performed data validation and identified patterns and trends in data sets; wrote test cases, executed test scripts, created system test reports, and reported on defects and incidents.
- Created incident, enhancement, and/or problem tickets in the ticket-tracking system and communicated effectively with the development and internal business operations teams.
- Managed the code development lifecycle in compliance with the Software Development Life Cycle (SDLC), from ideation to sprints to deployment, by implementing Agile methodology.
- Used CLM, ClearCase (check-out, check-in), and TFS to manage stories and tasks.
Project Environment: Informatica PowerCenter 9.6, Informatica PowerExchange 9.6, Big Data Developer, PostgreSQL, Greenplum, SQL Server 2012/2014, Oracle 11g/12c, DB2, TOAD, Erwin, PL/SQL Developer, PeopleSoft, AutoSys, Hive, Hue, XML, shell scripting, PuTTY, WinSCP.
Confidential
Sr. ETL Developer
Responsibilities:
- Involved as the primary on-site ETL developer and coordinated with offshore teams during the analysis, planning, design, development, and implementation stages of various projects using Informatica PowerCenter (9.6), SSIS (2012/14), SQL Server, Oracle, Visio, and SSRS.
- Involved in the administration of databases, modifying data elements, retrieval and reporting of information from the databases, and ensuring data security.
- Developed, implemented, and maintained database backup and recovery procedures, ensuring data integrity, security, and recoverability; developed data dictionaries, data models, and metadata repositories; and implemented Change Data Capture (CDC) to simplify ETL in data warehouse applications (a CDC sketch follows this list).
- Using the Microsoft Azure platform: Established connectivity from Azure to the on-premises data center using Azure ExpressRoute for single- and multi-subscription connectivity.
- Converted multiple onsite data centers to virtual infrastructure using Microsoft Hyper-V, reducing the physical server footprint while improving management, efficiency, and performance.
- Using Informatica PowerCenter (9.6) and Informatica Cloud: Developed mappings to load data from various sources (flat files, CSV, XML, XSD, Excel, MS Access, DB2, Salesforce, etc.) and transform it per user requirements into the data warehouse; scheduled sessions; created sequential and concurrent batches for sessions; and performance-tuned mappings, workflows, and sessions.
- Worked with the PowerCenter tools (Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformation Developer), created reusable mapplets, and worked with various transformations such as Expression, Aggregator, Stored Procedure, Java, Lookup, Filter, Joiner, Rank, Router, and Update Strategy.
- Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
- Involved in the migration of Informatica objects across all phases of the project (dev, int, and prod) and trained developers to maintain the system in production.
- Monitored the Informatica server environment on UNIX server regularly.
- Using SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), and SQL Server Reporting Services (SSRS) (2012/14): Designed SSIS packages to import and transform data between databases and external data sources using various transformations such as Data Conversion, File System Task, Row Count, OLE DB Source, OLE DB Destination, OLE DB Command, Merge Join, Lookup, Sort, and Data Flow Task.
- Created SSIS packages to extract data from Excel, CSV, and MS Access files using Pivot Transformation, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc., to generate the underlying data for the business reports.
- Created a report model on SSAS cubes.
- Wrote functions and stored procedures to process business logic in SSIS packages and the database, and tuned SQL queries for better performance.
- Involved in developing automated stored procedures, used in SSIS packages, to load the dimension and fact tables based on table type (see the MERGE sketch after this list).
- Used SQL Server as the database and designed various stored procedures with SQL Server and MDX queries to generate reports using SSRS/SSAS.
- Debugged SSIS packages and performance-tuned slow-running SSIS packages.
- Using Power BI: Formatted dashboards, multiple chart types, trends, and KPIs with Power View to support analyses.
- Translated the business requirements into workable functional and non-functional requirements at detailed production level using Workflow Diagrams, Sequence Diagrams, Activity Diagrams and Use Case Modeling.
- Utilized Power BI (Power View) to create various analytical dashboards that depict critical KPIs such as legal case matters, billing hours, and case proceedings, along with slicers and dicers enabling end users to apply filters.
- Created reports utilizing SSRS, Excel services, Power BI and deployed them on SharePoint Server as per business requirements.
- Using SQL, T-SQL, and PL/SQL: Designed, coded, and implemented tables, constraints, indexes, table types, external tables, external data sources, views, stored procedures, functions, triggers, cursors, complex joins and subqueries, CTEs, temp tables, and table variables.
- Used configuration management (SVN) to store all automated scripts and the recurring insert scripts that run automatically after every data model deployment, and widely used the JIRA dashboard to track issues and communicate with team members.
- Involved in writing shell scripts to automate the jobs run in TIDAL.
- Used the TIDAL and $Universe auto-scheduling tools to automate processes.
- Participated in committees, task teams, and other groups as needed, providing technical expertise and support for evaluating and recommending data management products, documenting results, and giving related presentations as necessary.
- Managed an ETL team ranging in size from 6 to 10 people.
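A hedged sketch of one common way the CDC work above is enabled natively in SQL Server; the database and table names are hypothetical, and Informatica-side CDC would look different.

    -- Hypothetical database and table: enable SQL Server Change Data
    -- Capture so downstream ETL reads only changed rows.
    USE SalesDW;
    GO
    EXEC sys.sp_cdc_enable_db;          -- enable CDC at the database level
    GO
    EXEC sys.sp_cdc_enable_table
         @source_schema = N'dbo',
         @source_name   = N'Customer',  -- hypothetical source table
         @role_name     = NULL;         -- no gating role in this sketch
    GO
    -- ETL can then query the generated change function, e.g.
    -- cdc.fn_cdc_get_all_changes_dbo_Customer(@from_lsn, @to_lsn, N'all').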
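And a minimal sketch of a dimension-loading procedure of the kind the SSIS packages might call, here a Type 1 upsert with MERGE; the schema is illustrative, not from the project.

    -- Hypothetical Type 1 dimension: upsert staged rows before fact loads.
    CREATE PROCEDURE dbo.usp_LoadDimCustomer
    AS
    BEGIN
        SET NOCOUNT ON;

        MERGE dbo.DimCustomer AS tgt
        USING stg.Customer    AS src
            ON tgt.CustomerKey = src.CustomerKey
        WHEN MATCHED THEN
            UPDATE SET tgt.CustomerName = src.CustomerName,
                       tgt.Region       = src.Region
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (CustomerKey, CustomerName, Region)
            VALUES (src.CustomerKey, src.CustomerName, src.Region);
    END;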
Project Environment: Informatica 9.6, PL/SQL Developer, SharePoint, UNIX, Oracle 11g, SQL Server, SSIS, PuTTY, WinSCP, Tidal, MS Visio, JIRA, SVN.
Confidential
Sr. ETL Developer
Responsibilities:
- Developed and implemented database objects for online and data warehouse business processes using the MSBI (SSIS/SSAS) stack, Informatica PowerCenter/PowerExchange 9.6, SQL Server, and EDI tools in a Windows environment, and ensured the success of the project by taking ownership of deliverables and assisting with or performing analysis, testing, and other tasks as necessary.
- Using MSBI: Created SSIS packages for data extraction from flat files, Excel files, and OLE DB sources to SQL Server using various transformations such as Slowly Changing Dimension, Lookup, Aggregate, Derived Column, Conditional Split, Fuzzy Lookup, Multicast, Data Conversion, Merge, and Union All.
- Worked with metadata-driven SSIS packages to pull data from different sources and load it into the data mart.
- Configured SSIS packages using the Package Configuration Wizard to allow packages to run in different environments, and deployed the SSIS packages from the development server to the production server.
- Extensively involved in performance tuning by determining bottlenecks at various points like targets, sources, mappings, and sessions.
- Created logging for ETL loads at the package and task levels to record the number of records processed by each package and each task in a package using SSIS.
- Created indexes on selective columns to speed up queries and analyses in SQL Server Management Studio.
- Used execution plans and SQL Server Profiler to trace slow-running queries and optimized SQL queries for improved performance and availability (see the indexing sketch after this list).
- Developed MDX scripts to create datasets for reporting, including interactive drill-down reports, report models, and dashboard reports.
- Created databases, tables, views, stored procedures, triggers, and user-defined functions.
- Using Informatica PowerCenter/PowerExchange 9.6: Developed Informatica Change Data Capture (CDC) mappings to process operational and transactional data in real time from the source systems into the data warehouse and data marts (where separate provider, patient, billing, pharmacy, etc. details are stored) in the PowerExchange environment, using transformations such as Expression, Sequence Generator, Router, Update Strategy, and Filter.
- Created Informatica mappings that followed best practices, client naming conventions, and error handling, and was responsible for performance tuning at the source, target, transformation, mapping, and session levels.
- Created folders, shared folders for shortcuts, groups, users and assigned permissions and privileges to groups and users using the Informatica Repository Manager.
- Created deployment groups for code migration in the Repository manager and coordinated with the production support team to migrate the deployment groups to Test and Production environments.
- Performed performance tuning at the source and target levels using indexes, hints, and partitioning in SQL Server and Informatica.
- Created variables and parameter files for mappings and sessions to ease migration across different environments and databases.
- HIPAA, EDI, and ICD codes: Worked extensively on the HIPAA 4010 and HIPAA 5010 transactions as source data, including 834, 835, 277, and 276 EDI X12 messages and ICD-9 and ICD-10 codes.
- Participated in the initial ICD-10 impact analysis for all critical functions, i.e., Membership & Enrollment, Claims Processing, Utilization Management, Medical Policy, Reporting, Vendor Management, etc.
- Involved in analyzing different modules of the Facets system and EDI interfaces to understand the source system and source data; extensively used reusable transformations, mappings, and code (via mapplets) for faster development and standardization, and used reusable sessions for different levels of workflows.
- Using Visio: Customized data models for the data mart supporting data from multiple sources in real time; developed logical and physical data models that capture current-state/future-state data elements and data flows.
- Coordination/team management: Coordinated with business analysts for requirements gathering and business analysis, prepared high-level technical specification documentation and the Informatica mapping document, and coordinated with source system owners to analyze sources for data feeds.
- Involved in Unit Testing, Integration Testing, and User Acceptance Testing to verify the data loaded into target systems is accurate as per requirements by evaluating test scripts and Data validation rules.
- Created Remedy tickets and tasks for change requests/code deployments and was extensively involved in team assessment meetings and technical peer reviews to obtain the functional manager's approval.
- Worked closely with database administrators to implement DDL changes and migrate the changes.
- Managed teams ranging in size from 6 to 10 people.
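A hedged sketch of the index tuning described above: a covering nonclustered index for a hypothetical slow query; the table and column names are illustrative, not from the project.

    -- Hypothetical table: cover a query that filters on OrderDate and
    -- returns a narrow column list, eliminating the key lookups seen in
    -- the execution plan.
    CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
        ON dbo.Orders (OrderDate)
        INCLUDE (CustomerId, TotalAmount);

    -- The query below can now be satisfied from the index alone.
    SELECT CustomerId, TotalAmount
    FROM dbo.Orders
    WHERE OrderDate >= '2016-01-01'
      AND OrderDate <  '2016-02-01';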
Project Environment: Informatica PowerCenter 9.6, Informatica PowerExchange 9.6, SQL Server 2012/2014, Oracle 11g, PeopleSoft.
Confidential
ETL/Database Developer
Responsibilities:
- Using Informatica PowerCenter 9.6/9.1/8.6: Designed a robust end-to-end ETL process involving complex transformations (Source Qualifier, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Filter, Expression, Stored Procedure, External Procedure, and Transaction Control) for the efficient extraction, transformation, and loading of data to staging and then to the data mart (data warehouse), verifying the complex logic for computing the facts.
- Monitored workflows and determined bottlenecks at various points (targets, sources, mappings, and sessions) to maximize performance, and resolved memory-related issues such as DTM buffer size and cache size to optimize session runs and process millions of input records at the estimated throughput.
- Created several jobs in Control-M to schedule Informatica workflows per business requirements.
- Worked on coding, testing, and tuning Informatica jobs and performed extensive data analysis and testing on reporting files, coordinating with the accounts team.
- Worked with Business analysts and the DBA for requirements gathering, business analysis, and designing of the data warehouse. Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data.
- Using databases (Postgres, Greenplum): Developed complex PL/SQL and SQL scripts using the Oracle Toad and SQL Developer tools to validate the ETL process.
- Involved in Performance Tuning of SQL Queries and handled daily load issues in Data Extracts.
- Involved in creating tables, table partitions, materialized views, and indexes, as well as PL/SQL stored procedures, functions, triggers, and packages.
- Involved in Unit testing and regression testing of tuned data extracts by coordinating with the offshore team.
- Performed extensive bulk loading and normalization into the target using Greenplum Loader.
- Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
- Used JIRA tools to ensure applications, products and/or releases comply with the Bank’s QA standards.
- Participated in overall systems integration testing, provided input and direction on the scripts' scope, and maintained testing data for SIT and UAT environments.
- Used Erwin/Toad for logical and physical database modeling of the warehouse; responsible for creating database schemas based on the logical models.
- Wrote complex SQL scripts to avoid Informatica lookups and improve performance, as the data volume was heavy.
- Used the pmcmd command to start, stop, and ping the server from UNIX, and created UNIX shell scripts to automate the process.
- Worked with XSD and XML file generation through the ETL process.
- Performed the migration of ETL and database objects to production according to the SDLC model.
- Understood financial products and how they are reported on by the Risk, Product Control, and Finance departments within an investment bank.
- Provided on-call production support on a rotating basis, every two weeks, 24/7.
Confidential
Software Developer/Programmer
Responsibilities:
- Worked with the client to gather requirements and participated in full-lifecycle development: designing, coding, implementing, testing, and debugging programs according to detailed requirements; maintaining the system; and configuring network workstations and peripherals.
- Participated in design, development, acceptance testing, and implementation of software programs and developed flow diagrams and/or pseudo-code.
- Provided advice and assistance concerning computer software and equipment specifications for utilization and acquisition, resolved issues involving conflicting requirements, and recommended actions based on analysis of complex problems.
- Created users and assigned privileges, including System Privileges, Object Privileges, and user Roles, which targeted specific tasks required by the user.
- Deployed PL/SQL policies to State Department Servers worldwide, including Windows and Linux servers.
- Conducted Presentations to stakeholders regarding the functionality, development, and usage of database security.
- Documented the issues encountered during the system development life cycle and the installation procedures and steps to be used with Windows and Linux servers.
- Played a vital role in expanding code coverage by writing unit, functional, and integration tests for quality assurance.
- Determined testing requirements, developed and reviewed scripts for positive and negative test scenarios, conducted baseline testing, and generated reports; tested the server and web portal.
- Reviewed application, system, security, and programming code for errors; escalated issues and verified fixes, resolving 100+ bugs before product launch.
- Involved in developing user interface (UI) and checking validations using HTML/CSS, AJAX, and JavaScript.
- Documented a step-by-step guide for new users of the application, also implemented as web-page help.
- Documented the issues encountered during the system development life cycle and created a troubleshooting guide for known issues.
- Conducted integrated analysis of multiple audit logs (e.g., firewall, web server), monitored network and system performance, and troubleshot problems as they arose.
- Collaborated with Business analysts and the DBA for requirements gathering, business analysis, and designing the data marts.
- Extensively used Informatica Client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Informatica Repository Manager, and Workflow Manager.
- Developed various mappings using Aggregator, Lookup and Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter, and Sequence Generator.
- Created complex mappings involving SCDs to implement business logic and capture deleted records in the source systems.
- Developed PL/SQL Stored procedures for database updates and created the necessary indexes in the target tables.
- Created PL/SQL packages and Stored procedures to load data from various sources to the staging area.
- Created and scheduled sessions and jobs to run on demand, on schedule, or only once using Workflow Manager.
- Performed Unit testing, Integration testing, and System testing of Informatica mappings.
- Wrote UNIX shell scripts to work with flat files to define parameter files and create pre-and post-session commands.
- Used Debugger in troubleshooting the existing mappings.
TECHNICAL SKILLS
Data Warehousing & BI: SSIS (Integration Services), SSAS (Analysis Services), SSRS (Reporting Services), Informatica PowerCenter/PowerExchange 10.2/9.1/8.6/8.5 (Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Informatica Server), Informatica Cloud, Informatica B2B Data Transformation, Azure cloud platform (Azure Data Factory, Azure Data Lake, Azure SQL), XMLSpy, Eclipse, MS Visual Studio 2008/2010/2017, Erwin, Visio.
Cloud Platform: Azure, Informatica Cloud, Salesforce
Analysis & Reporting Tools: Excel, SSAS 2016/2008R2, SSRS, Power BI.
Databases: SQL Server 2016/2012/2008R2, Greenplum, pgAdmin, Oracle, DB2, MySQL, MS Access, MS Excel.
Programming: SQL, MDX, T-SQL, UNIX shell scripting, XML, HTML and CSS, basics of C.
Scheduling tools: Control-M, Tivoli (TWS), $Universe
IT Processes: Software Development Life Cycle (SDLC), project management, systems analysis, Agile process.
Version Control: Visual SourceSafe (VSS), Team Foundation Server (TFS), JIRA, SVN, CLM, ClearQuest/ClearCase.
Productivity Applications: MS Word, MS Excel, MS Access, MS Project, Visio, VSTF.
Operating Systems: VMware, Linux, Windows 98/2000/2003/XP/NT/Vista/7, and Windows 10.