
Senior Associate Resume


SUMMARY

  • Overall, 9+ years of experience in the IT industry with data warehousing, OLAP reporting tools, ETL tools and cloud platforms, applying industry best methodologies and procedures including Data Governance, Data Integration and Data Quality Assurance, for clients across business domains such as Insurance, Manufacturing, Education, Media and Health Care
  • Expert knowledge of Data Warehousing ETL using Informatica PowerCenter 10.x/9.x (Designer, Repository Manager, Repository Server Administration Console, Server Manager, Workflow Manager, Workflow Monitor)
  • Experienced with the Snowflake, AWS and Azure cloud platforms, with strong hands-on experience using Azure Data Factory for data loads
  • Experienced in the Snowflake cloud data warehousing shared technology environment, providing stable infrastructure, secure environments, reusable generic frameworks, robust design architecture, best practices and automated SCBD (Secured Database Connections, Code Review, Build Process, Deployment Process) utilities
  • Involved across the SDLC, including application development, data modeling, business data analysis, ETL/OLAP processes, business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing
  • Experienced in Microsoft BI technologies: SSIS, SSRS and SSAS
  • Strong experience in extraction, data migration, transformation and loading (ETL) of data from various sources into DWHs and Data Marts using Informatica PowerCenter 9.6.1/10.1/10.2 (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager) and PowerConnect on Oracle, DB2 and SQL Server databases, along with IDQ
  • Performed data profiling and analysis using Informatica Data Quality (IDQ): creating data profiles and custom filters, cleansing data and developing scorecards
  • Experienced with Informatica transformations such as Normalizer, Lookup, Source Qualifier, Expression, Aggregator, Sorter, Rank and Joiner
  • Experienced in SQL, PL/SQL, T-SQL, DAX and UNIX shell scripting
  • Experienced in designing conceptual, logical and physical data models using ER Studio data modeling tools, and in developing metadata repositories
  • Practical understanding of data modeling (dimensional and relational) concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables
  • Expertise in working with relational databases such as Oracle 12c/11g/10g, SQL Server 2008/2016/2018, DB2 8.0/7.0, Snowflake and MySQL; strong experience writing complex SQL queries, stored procedures and triggers
  • Experienced with various scheduling tools: Control-M, AutoSys, Maestro/TWS and cron jobs
  • Performed extensive data profiling and analysis to detect and correct inaccurate data in databases and to track data quality
  • Experienced in Agile/Scrum and Waterfall methodologies and the Tableau reporting tool
  • Experienced in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions
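As an illustration of the column-level profiling work described above (null rates, distinct-value counts), a minimal sketch follows. This is illustrative only: Informatica Data Quality performs this natively, and the sample records and column names here are hypothetical.

```python
# Minimal column-profiling sketch: per-column null percentage and
# distinct-value count over a list of dict records. Illustrative only;
# IDQ does this natively, and the sample data below is hypothetical.

def profile(rows, columns):
    """Return {column: {"null_pct": ..., "distinct": ...}}."""
    total = len(rows)
    report = {}
    for col in columns:
        values = [r.get(col) for r in rows]
        nulls = sum(1 for v in values if v is None)
        distinct = len({v for v in values if v is not None})
        report[col] = {
            "null_pct": round(100.0 * nulls / total, 1) if total else 0.0,
            "distinct": distinct,
        }
    return report

if __name__ == "__main__":
    rows = [
        {"customer_id": 1, "state": "OH"},
        {"customer_id": 2, "state": None},
        {"customer_id": 3, "state": "OH"},
        {"customer_id": 4, "state": "TX"},
    ]
    print(profile(rows, ["customer_id", "state"]))
```

A real scorecard would add rules (pattern checks, reference-data lookups) on top of these basic counts.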

TECHNICAL SKILLS

  • C#, JSON, DAX, SQL, PL/SQL.
  • Windows and UNIX.
  • Informatica PowerCenter 9.6.1/10.2.1, Informatica Data Quality 9.6.1, SSIS 2008R2/2012, SSRS 2008R2, SSAS 2008/2012 Tabular Model, PDW database, data warehouse tools, Salesforce, Azure Cloud and AWS Cloud.
  • IBM DB2, Oracle 10g and 11g, SQL Server 2008R2/2012, Vectorwise, Snowflake.
  • TFS, GIT and CVS
  • AutoSys, Control-M, Maestro Scheduler and cron jobs.
  • UNIX shell scripting and AIX shell scripting.
  • SQL Profiler, IBM Data Studio, TOAD, PuTTY, GitHub, WinSCP, SQL configuration management, SQL Developer, TFS (Team Foundation Server), JIRA, Bitbucket, Kintana, PVCS, ServiceNow, FileZilla and FileMaker

PROFESSIONAL EXPERIENCE

Senior Associate

Confidential

Responsibilities:

  • Work with Sales and Marketing analysts to gather technical requirements and convert them into technical specifications for the Informatica Data Integration and Snowflake architecture and Oracle, which involves reviewing existing system configurations and operating methods as well as understanding evolving business needs. Interact with the Snowflake architecture team and database administrators to provide a design/solution suited to customer needs
  • Lead and guide development of the ETL architecture (Azure/Informatica/Snowflake), develop solutions in a highly demanding environment and provide hands-on guidance to other team members. Handle complex ETL requirements and design using Azure/ADF, Informatica, Snowflake, UNIX scripts, etc.
  • Design and develop best practices with Informatica, Snowflake, PL/SQL scripts, UNIX shell scripts and Tableau reports to create an integration channel between multiple systems (EBIP, EPR and PDW2)
  • Implement slowly changing dimension (SCD) logic so the Informatica integration layer detects changed records and loads them into the corresponding tables. The reporting layer then populates dashboards for business stakeholders, with the Snowflake cloud data warehouse as the underlying system
  • Design the fact and dimension tables for the project, present the logical data model and perform data profiling using IDQ
  • Develop UNIX shell scripts that retrieve files from source systems via SFTP commands and load the data into EBIP Oracle tables using import functionality; the scripts also extract data per downstream system requirements using export functionality and SFTP the files to the target systems
  • Create stored procedures and functions to load data consumed from the MDM Oracle system into fact and dimension tables
  • Create data integrations that ingest data from multiple external sources using Informatica Data Integration and load it into the Snowflake cloud data warehouse; develop complex Informatica Data Integration mappings for the Snowflake cloud data warehouse
  • Create database objects such as tables, schemas, columns, primary/foreign key constraints, joins, analytical functions, indexes, stored procedures, triggers and transformation design details; translate business requirements into technical solutions
  • Participate proactively in business walkthrough and status calls. Join defect triage calls with business users, the QA team and the development team, and provide functional expertise on the Sales & Marketing application. Update clients on development and project status weekly.
  • Provided training to team members and assisted them in accomplishing results
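To illustrate the SCD change-detection step mentioned above, here is a minimal Python sketch of Type 2 handling: compare incoming records to the current dimension rows by business key, expire the old version and insert a new one when attributes change. In the project this logic lives in Informatica mappings; the keys, attributes and date format here are hypothetical.

```python
# Sketch of slowly-changing-dimension (Type 2) change detection.
# Illustrative only -- in practice this is implemented in Informatica
# mappings; business key, tracked attributes and dates are hypothetical.

def apply_scd2(dimension, incoming, key, attrs, load_date):
    """Apply one incoming batch to a list-of-dicts dimension table."""
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old is None:
            # brand-new business key: insert as the current version
            dimension.append({**rec, "valid_from": load_date,
                              "valid_to": None, "is_current": True})
        elif any(old[a] != rec[a] for a in attrs):
            # changed record: expire the old version, insert a new one
            old["valid_to"] = load_date
            old["is_current"] = False
            dimension.append({**rec, "valid_from": load_date,
                              "valid_to": None, "is_current": True})
        # unchanged records are left untouched
    return dimension
```

Keeping the expired rows (with `valid_from`/`valid_to` ranges) is what lets the dashboards report history as of any date.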

Environment: PL/SQL, Informatica PowerCenter, Informatica Data Quality, Oracle 12c, Snowflake, Salesforce, UNIX shell scripting, Azure Data Factory, CVS, Vectorwise, Tableau

Senior Associate

Confidential

Responsibilities:

  • Provided technical solutions to the customer by conducting walkthroughs with the business and understanding the business processes involved. Provided estimates based on requirements from the business analyst, which formed the basis for project planning
  • Interacted with METCARE and CBIT business and data analysts to gather technical requirements and convert them into technical specifications for Informatica PowerCenter, Oracle and IBM DB2, which involved reviewing existing system configurations and operating methods as well as understanding evolving business needs
  • Implemented slowly changing dimension (SCD) logic so the Informatica integration layer detects changed records and loads them into the corresponding tables. The reporting layer populates dashboards for business stakeholders, with the CBIT Snowflake cloud data warehouse as the underlying system
  • Developed UNIX shell scripts that retrieve files from source systems via SFTP commands and load the data into METCARE DB2 tables using import functionality; the scripts also extract data per downstream system requirements using export functionality and SFTP the files to the target systems. Interacted with the architecture team and database administrators to provide a design/solution suited to customer needs
  • Worked with Informatica PowerCenter client tools such as Source Analyzer, Warehouse Designer, Mapping Designer and Transformation Developer to build complex mappings with Joiner, Lookup, Filter, Source Qualifier, Sequence Generator, Aggregator, Expression and Update Strategy transformations for the CBIT Snowflake cloud data warehouse
  • Created data integrations that ingest data from multiple external sources using Informatica Data Integration and load it into the Snowflake cloud data warehouse; developed complex Informatica Data Integration mappings for the CBIT Snowflake cloud data warehouse
  • Participated proactively in business walkthrough and status calls. Joined defect triage calls with business users, the QA team and the development team, and provided functional expertise on the METCARE and CBIT applications. Updated clients on development and project status weekly

Environment: PL/SQL, Informatica PowerCenter 9.6.1, Informatica Data Quality, IBM DB2, Salesforce, UNIX shell scripting, Maestro Scheduler, UNIX, AIX

Senior Associate

Confidential

Responsibilities:

  • Worked closely with business stakeholders, stakeholders of multiple legacy systems, and data and business analysts to gather technical requirements, and interacted with the architecture team and database administrators to enable smooth IT coordination in providing a suitable design/solution to customers
  • Coordinated with technical/functional teams across multiple geographies to help design data models and configurations to be implemented on the MDM (Master Data Management) and OneCRM platforms and AWS services
  • Worked on a range of activities including analyzing, designing and developing best practices with Informatica (mappings, mapplets, sessions, workflows and worklets), UNIX shell scripts and PL/SQL procedures to migrate source data (JSON format) from Master Data Management (Oracle Cloud) to the Salesforce OneCRM application. Executed unit test cases for each release and regularly promoted code to production
  • Responsible for configuring and scheduling Informatica jobs using the AutoSys scheduling tool. Proactively modified existing Informatica mappings and PL/SQL code to support new business requirements
  • Assisted ongoing development with technical best practices for data movement, data quality, data cleansing and other ETL and PL/SQL activities. Participated actively in scrum calls. Created a Change Request in JIRA for each business enhancement and tracked it until it was deployed to production
  • Participated actively in business walkthrough and status calls. Joined defect triage calls with business users, the QA team and the development team, and provided functional expertise on the OneCRM application. Updated clients on development and project status weekly
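The migration of JSON-format MDM records toward a CRM target described above centers on a field-mapping transform. A minimal Python sketch follows; the source and target field names (and the external-ID convention) are hypothetical, since the actual pipeline uses Informatica mappings and PL/SQL.

```python
import json

# Sketch of the JSON field-mapping step when moving MDM records toward
# a CRM target. Field names are hypothetical; the real pipeline uses
# Informatica mappings and PL/SQL procedures -- this only illustrates
# the shape of the transform.

FIELD_MAP = {
    "party_name": "Name",
    "party_phone": "Phone",
    "party_country": "BillingCountry",
}

def transform(mdm_json):
    """Map one MDM JSON record to the target CRM field names."""
    record = json.loads(mdm_json)
    out = {crm: record.get(mdm) for mdm, crm in FIELD_MAP.items()}
    out["ExternalId__c"] = record["party_id"]  # keep the source key for upserts
    return out

if __name__ == "__main__":
    src = ('{"party_id": "P-100", "party_name": "Acme", '
           '"party_phone": "555", "party_country": "US"}')
    print(transform(src))
```

Carrying the source key through as an external ID is what makes reloads idempotent: the target can upsert on it instead of creating duplicates.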

Environment: PL/SQL, Informatica 10.2.1, Informatica Data Quality, Oracle 10g and 11g, Salesforce, UNIX scripting, JIRA, Bitbucket, Kintana, PVCS, AutoSys
