- Informatica ETL developer with about 6 years of experience.
- Experienced in data integration with Informatica PowerCenter: designing and developing complex mappings, mapplets, transformations, workflows, and worklets, and scheduling workflows and sessions.
- Knowledge of all phases of the SDLC, including requirements gathering, analysis, design, and development.
- Proficient in loading and extracting data from Confidential databases and in SQL querying.
- Preparing for the AWS Solutions Architect Associate certification, with extensive hands-on exercises across AWS infrastructure: Elastic Compute Cloud (EC2), Application Load Balancer, Auto Scaling Groups, Elastic File System, S3 and CloudFront, databases on AWS (RDS), Route 53, and Virtual Private Cloud (VPC).
- Used Tableau to build visualizations, dashboards, and stories against Excel and a Confidential 11g database with both live and extract data, applying joins, heat maps, graphs, bins, packed bubbles, table calculations, geocoding, parameters, and trend lines, and published them to Tableau Server.
- Good at analyzing the functional and technical specifications of a project and capturing gaps in business requirements.
- Able to multi-task, analyze and solve problems effectively, and quickly pick up new technologies and tools; works well both in a team and independently.
- Good written and verbal communication skills and the ability to work with minimal supervision.
- Green Card holder / permanent resident of the USA; does not need current or future work visa sponsorship.
ETL Tools: Informatica PowerCenter 10.2/9.6.1/9.5.1
RDBMS: Confidential 12c/11g, MS Access
QA Tools: QC, JIRA
Reporting Tools: Tableau 2018/ 2019, Excel
Languages: Python, SQL, PL/SQL, VBA
Operating Systems: Windows, Unix
- Created complex ETL mappings to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, connected and unconnected Lookup, Filter, Sequence Generator, Router, and Update Strategy.
- Gathered requirements from users and created and reviewed the ETL specifications.
- Designed the incremental load strategy and created reusable transformations, mapplets, mappings, sessions, and workflows.
- Used session parameters and mapping variables/parameters, and created parameter files to allow flexible workflow runs driven by changing variable values.
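A parameter file of the kind described above follows PowerCenter's section/key-value format; the folder, workflow, session, and connection names below are placeholders, not taken from an actual project:

```
[FolderName.WF:wf_daily_load.ST:s_m_load_customers]
$$LastRunDate=2019-01-01
$DBConnection_Source=Conn_Src
$DBConnection_Target=Conn_DW
```

Pointing a workflow or session at a different file like this changes connections and variable values without editing the mapping itself.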
- Used SQL and PL/SQL to extract, process, and load data for ETL jobs developed with a combination of Perl and shell scripting; resolved production issues such as ETL job failures when loading the data warehouse and ODS, and data issues affecting downstream reports.
- Maintained source and target definitions and transformation rules using Informatica Repository Manager.
- Created PL/SQL stored procedures, functions, and packages for moving the data from staging area to data mart.
- Extensively followed ETL methodology to support data extraction, transformation, and loading.
- Reviewed and analyzed JIRA stories and worked with the Test Manager to understand and analyze functional requirements.
- Developed unit test scenarios to ensure that all ETL jobs and website functionality worked per the business team's requirements.
- Actively participated in daily Agile stand-up meetings to keep the team informed of testing progress and to discuss blockers or issues.
- Used JIRA to enter, track, and close defects.
- Actively participated in defect review meetings with team members.
- Involved in system integration testing.
- Performed data analysis using Python Pandas.
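Analysis of the kind mentioned above often means profiling and aggregating extracted data before it is loaded; a minimal sketch with pandas (the column names, values, and thresholds here are illustrative, not from an actual project):

```python
import pandas as pd

# Illustrative extract; real data came from Confidential tables and flat files.
df = pd.DataFrame({
    "region": ["East", "West", "East", "West", "East"],
    "amount": [120.0, 250.0, None, 310.0, 95.0],
})

# Basic quality check: count nulls per column before loading.
null_counts = df.isna().sum()

# Fill missing amounts and aggregate by region, as a load-ready summary.
df["amount"] = df["amount"].fillna(0)
summary = df.groupby("region", as_index=False)["amount"].sum()
```

The same null-count and group-by pattern scales to real extracts read with `pd.read_csv` or a database connection.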
- Responsible for extracting and transforming data from Confidential databases and flat files and loading it into targets.
- Carried out regression testing to ensure that no existing functionality broke due to job enhancements.
- Developed various mappings using Mapping Designer, working with Source Qualifier, Aggregator, connected and unconnected Lookup, Filter, and Sequence Generator transformations.
- Involved in Data Modeling and design of Data Warehouse and Data Marts in Star Schema methodology with conformed and granular dimensions and fact tables.
- Used SQL and PL/SQL for database-related functionality.
- Performed complex defect fixes in various environments like UAT, SIT, etc. to ensure the proper delivery of the developed jobs into the production environment.
- Wrote complex SQL queries and PL/SQL procedures to perform database operations per business requirements.
- Worked closely with testers to identify medium- and high-severity defects that could affect downstream systems, and fixed them before moving the jobs into production.
- Designed and deployed rich graphic visualizations with drill-down and drop-down menu options and parameters using Tableau.
- Used advanced Excel features such as pivot tables and charts to generate graphs.
- Designed and developed weekly and monthly reports using Excel techniques (charts, graphs, pivot tables) and PowerPoint presentations.
- Strong Excel skills, including pivots, VLOOKUP, conditional formatting, and large record sets, as well as data manipulation and cleaning.
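The pivot-table and VLOOKUP work described above has a direct pandas analogue; a hedged sketch with made-up data (this substitutes pandas for Excel purely to illustrate the operations):

```python
import pandas as pd

# Hypothetical weekly report data; the real reports used Excel pivot tables.
sales = pd.DataFrame({
    "week": ["W1", "W1", "W2", "W2"],
    "product": ["A", "B", "A", "B"],
    "units": [10, 5, 7, 8],
})

# Equivalent of an Excel pivot table: weeks as rows, products as columns.
pivot = sales.pivot_table(index="week", columns="product",
                          values="units", aggfunc="sum")

# Equivalent of VLOOKUP: left-join a lookup table on a key column.
prices = pd.DataFrame({"product": ["A", "B"], "price": [2.0, 3.0]})
priced = sales.merge(prices, on="product", how="left")
priced["revenue"] = priced["units"] * priced["price"]
```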
- Prepared dashboards in Tableau using calculations and parameters.
- Created and managed Reserved EC2 instances and RDS resources.
- Created S3 buckets to store business objects and added bucket policies and permissions to restrict access to required users only.
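A bucket policy of the kind described above might look like the following (the bucket name, account ID, and user are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowOnlyNamedUser",
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::123456789012:user/etl-reader"},
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::example-business-objects/*"
    }
  ]
}
```

Attached to the bucket, this grants read access on its objects to a single IAM user; broader access is then denied by default.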
- Resolved production issues, providing appropriate solutions under tight deadlines and SLAs.
- Prepared and reviewed SOX process flows and risk & control matrices.
- Assisted with monitoring, and occasional design and redesign, of accounting and reporting processes to ensure SOX compliance.
- Managed all US banking and treasury activities, including deposits, wires, check runs, short-term investments, and the corporate AMEX credit card program.
- Prepared weekly cash flow statements after consultation with finance and field teams.