End-to-End ETL Designer, Data Analyst, ETL & BI Developer and Project Lead Resume
SUMMARY:
- Around 11 years of professional software experience in Data Warehousing and Business Intelligence (DW&BI) as an ETL Developer, Data Analyst, End-to-End Designer, Team Lead, Technical Lead and Project Lead on multiple projects across geographies.
- Well versed in various phases of the SDLC, including Requirements Gathering, Analysis, Design, Development, Testing, Implementation, Production Code Support and Maintenance for end-to-end IT solution offerings.
- Good knowledge of data warehousing concepts, Informatica (ETL, IDQ and MDM), Teradata, Oracle SQL, SQL Server, data science, machine learning using R programming, Python, Big Data, QlikView, Actuate, Cognos and Talend Data Integration.
- Solid skills in data discovery, data analysis, data profiling, data compiling, data transformation mappings, data integration, data quality control, data governance and data validations.
- Established, maintained and enforced ETL architecture design principles, techniques, standards and best practices. Prepared High Level Design (HLD) and Low Level Design (LLD) documents.
- Proven track record in employing ETL tools - Informatica and Talend - for data quality, data transformation, mapping and integration in high end complex warehousing and migration projects with extensive experience in Oracle, SQL Server and Teradata databases.
- Experienced in conceptual, logical and physical data modelling for OLTP, OLAP and Enterprise Data Warehouses and Data Marts.
- Good experience in data profiling on business data to identify data anomalies and to define/create rules to cleanse data using the Informatica Analyst and Developer tools.
- Experience in data standardization, cleansing and enrichment to resolve completeness, conformity, consistency and accuracy issues using Informatica Data Quality (IDQ).
- Managed technical designs of ETL reference architectures to ensure high data quality, data integration performance, error recovery/handling and optimized performance.
- Pioneered in different load strategies from heterogeneous sources to target. Successfully implemented SCD Type1/Type2 load, Capture Data Changes to maintain the Data history and Optimization techniques.
- Experienced as a database SQL developer in Teradata, Oracle 9i and SQL Server, covering data recovery, performance tuning, table partitioning, database architecture, monitoring and database migration.
- Adept at writing stored procedures, triggers, indexes and functions in PL/SQL, SQL scripts and BTEQ scripts. Developed various reports and dashboards using the Actuate reporting tool and tube maps using QlikView.
- Experienced in Release management activities and Migration of codes from Repository to Repository, prepared Technical/Functional Specification Documents along with unit test and validation documents.
- Led the design and implementation of the following processes: business requirements gathering, information access and data discovery, ETL and SQL coding, code reviews, data quality control, testing, release schedules/requirements and delivery through BI and analytics platforms.
- Good Domain Knowledge in Banking and financial services - Confidential Banking, Retail banking, Card services and Insurance.
- Strong technical, managerial, leadership and interpersonal skills.
- International work experience, working face to face with clients at client locations in the USA, India and the UK. Skilled at interacting with business users and various stakeholders.
- Excellent problem-solving and communication skills, with the ability to clearly translate complex technical aspects into easily understandable concepts.
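As an illustration of the SCD Type 2 load strategy mentioned above, here is a minimal Python sketch; the table, column and key names are hypothetical, and a list of dicts stands in for the dimension table (the real implementations used Informatica mappings and Teradata):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended effective end date

def apply_scd2(dimension, incoming, load_date):
    """Apply an SCD Type 2 change: expire the current row for the
    business key, then insert a new current version. All names here
    are illustrative, not from an actual project schema."""
    for row in dimension:
        if row["cust_id"] == incoming["cust_id"] and row["end_date"] == HIGH_DATE:
            if row["address"] == incoming["address"]:
                return dimension  # no attribute change, nothing to do
            row["end_date"] = load_date  # expire the old version
    # insert the new current version with an open-ended end date
    dimension.append({
        "cust_id": incoming["cust_id"],
        "address": incoming["address"],
        "start_date": load_date,
        "end_date": HIGH_DATE,
    })
    return dimension

dim = [{"cust_id": 1, "address": "Austin",
        "start_date": date(2020, 1, 1), "end_date": HIGH_DATE}]
apply_scd2(dim, {"cust_id": 1, "address": "Dallas"}, date(2021, 6, 1))
```

The expired row keeps its history while the new row becomes current, which is what preserves the data history referred to in the summary.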
TECHNOLOGIES:
ETL Tools: Informatica PowerCenter, Informatica Data Quality (IDQ), Informatica MDM, Talend Data Integration
Database: Oracle, SQL-Server, Teradata
Reporting Tools: Cognos, Actuate, Qlik View
GUI Tools: Teradata SQL Assistant, Queryman, BTEQ, TOAD, SQL Plus, Putty, WinSCP, R-Studio, Anaconda.
Languages: SQL, PL/SQL, Python, R, C, C++, VHDL, Verilog.
Operating Systems: Windows 95/98/ME/NT/XP/Vista/7/8, Unix, Mac.
Scheduling Tools: Tivoli, Control-M.
PROFESSIONAL EXPERIENCES:
Confidential
End to End ETL designer, Data Analyst, ETL & BI developer and Project lead
Responsibilities:
- Performed an extensive data analysis on the current data movement process and procedures in databases.
- Involved in identifying data patterns, cleansing data, and designing, developing, testing and implementing models to provide valuable data insights to business users.
- Performed the impact analysis for tokenizing sensitive data and SRP changes on the existing applications.
- Prepared the requirements traceability matrix documents capturing the functional and non-functional requirements.
- Managed technical designs of ETL reference architectures to ensure high data quality, data integration performance, error recovery/handling and optimized performance.
- Performed Feasibility, Adaptability and Risk Analysis to identify the business critical and high-risk areas of the Application.
- Conducted Gap Analysis by creating Swim lane diagrams and highlighted the differences between the Current (AS IS) model versus the Target (TO BE) model.
- Presented the strategic and tactical solution approaches to the project teams to easily decide the best approach to follow under the stringent project timelines.
- Prepared the high-level design documents, clearly capturing the design constraints, risks, assumptions, dependencies and issues.
- Selected the appropriate elicitation technique to efficiently identify critical requirements and high-level designs, conducting multiple walkthrough sessions with the various stakeholders, design authorities, vendors and CIOs for review and sign-off.
- Supervised the data warehousing functions of data compilation, extraction, loading, cleansing and modelling.
- Ensured the tokenization design for card data is PCI-DSS compliant and aligns with the GIS (Global Information Security) administration functions.
- Integrated multiple projects such as data quality, metadata management, and reference and master data into enterprise-class structures.
- Performed data quality management, which included leading assessments of the different domain areas, conducting data profiling, developing DQ dashboards, and developing and executing remediation plans as appropriate.
- Built the Telephony data model for Confidential calls to improve the automated machine call answering process for better business needs.
- Performed analysis to identify data anomalies, data cleansing (ETL) and resolved data quality issues using IDQ.
- Performed data exploration and transformation; created Teradata BTEQ scripts and stored procedures to load the call data model on a daily and monthly basis.
- Project management experience includes day to day interaction with client to report progress on existing projects and discuss the feasibility of new projects, business impact analysis/risk analysis, prepare status & metric reports and manage & train junior associates on the team.
- Worked on dashboard designs; architected and built a tube-map view using QlikView, projecting the daily/weekly/monthly/quarterly call volumes and call journeys through the mortgages-telephony waterfall model.
- Developed Teradata SQL procedures to incorporate complex business logic and load the daily call data into the telephony data model. Created weekly reports and provided statistics and graphs to the business users.
- Project Plan Preparation, scheduling all project activities and conducting weekly Status Conference with the Client Project Management team.
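The data profiling and DQ scorecard work above can be sketched in miniature. The rule names, pattern and metrics here are illustrative stand-ins for the actual IDQ rules, and plain Python stands in for the Analyst/Developer tools:

```python
import re

def profile_column(values, pattern=None):
    """Profile one column the way a DQ scorecard would: completeness
    (non-null share), conformity (share matching an expected pattern)
    and uniqueness. All names and rules are illustrative."""
    total = len(values)
    non_null = [v for v in values if v not in (None, "")]
    completeness = len(non_null) / total
    conformity = (
        sum(1 for v in non_null if re.fullmatch(pattern, v)) / len(non_null)
        if pattern and non_null else None
    )
    uniqueness = len(set(non_null)) / len(non_null) if non_null else 0.0
    return {"completeness": completeness,
            "conformity": conformity,
            "uniqueness": uniqueness}

# e.g. profiling a ZIP-code column against a five-digit rule
scores = profile_column(["75001", "7500A", None, "75002"], pattern=r"\d{5}")
```

Scores like these feed the kind of DQ dashboards and remediation plans described in the bullets above.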
Environment: - Informatica PowerCenter 9.6, Informatica MDM & IDQ 9.6, Teradata, SQL, PL/SQL, Putty, UNIX Shell Scripting, TWS (Tivoli Workload Scheduler), Big Data, Hadoop Data Lake, Protegrity Tokenization - FPG, AP and ESA appliances.
Confidential, Texas
Technical lead
Responsibilities:
- Led the Confidential Data Integration Hub (DIH) team in building a centralized source of truth across the Confidential enterprise for all reporting needs in Confidential banking.
- Performed business impact analysis and feasibility analysis of proposed projects and processes, and provided solutions.
- Worked on Agile Methodology, participated in daily/weekly team meetings, guided two groups of seven developers and data analysts, peer reviewed their development works and provided the technical solutions.
- Proposed ETL strategies based on requirements. Conducted business risk analysis to identify and develop effective risk mitigation strategies.
- Review and revise project plans to meet challenging requirements on the project.
- Interacted with business users to gather and document requirement specifications.
- Extensively worked on data analysis, data profiling, data cleansing and data validation processes to ensure no gaps in data and maintain data integrity/quality and governance.
- Performed data profiling and remediation for inconsistencies, completeness and duplicates through SQL scripts.
- Involved in architectural analysis, risk assessment, design, development and testing activities of the project.
- Created store procedures & queries for the complex business logic to accomplish the business reporting needs.
- Took the additional initiative of conducting knowledge-sharing sessions within the team to improve the quality of deliverables and give new entrants a better understanding of Confidential banking terminology.
- Peer-reviewed code against the technical specifications to verify that the logic met the business requirements and client standards, and fixed any discrepancies.
- Identified feasible alternative approaches, systems and equipment to reduce cost and improve efficiency while meeting expectations.
- Coordinated multiple DIH Confidential Banking releases throughout the year, which required tracking development status, review, release management and deployment of code. Executed scripts and managed scheduled activities based on business requirements.
- Mentored individual team members for improved performance, manage personal development plans for all direct reports and define a clear career path.
- Project Plan Preparation, scheduling all project activities and weekly status conference with the Client Project Management teams.
- Coordinated communication and deliverables between Confidential onsite and offshore teams and worked with the client teams.
- Developed and managed creation of product documentation to communicate features, benefits, positioning and impacts to different audiences and ensured that documentation deliverables from various groups are produced as needed.
Environment: SQL Server 2008, SSIS, SSRS, Cognos, PL/SQL, SQL Plus, Putty, UNIX Shell Scripting, SQL-Developer.
Confidential, Texas
ETL Team lead
Responsibilities:
- Involved in meetings with business to gather information, requirements and finalize DQ Rules with Informatica DQ Analyst Tool and Developer Tool.
- Involved in Design and Development of Exception Handling process for Business and Field level validations.
- Involved in the DQ management process: Profile, Collaborate, Standardize/Match/Consolidate and Monitor.
- Well versed in rule and column profiling to analyse data inconsistencies: completeness, conformity, consistency, accuracy, duplicates and integrity.
- Created Scorecards, Join Analysis and other reports based on the data showing trend of the quality issues.
- Involved in scorecard creation to gauge the quality of data, and used managed and unmanaged reference tables to standardize data.
- Developed Case Converter, Merge, Standardizer and Parser Transformations to enrich and standardize data.
- Built several reusable components on IDQ using Parsers, Standardizers and Reference tables which can be applied directly to standardize and enrich Address information
- Involved in analysis and fine tuning of Match and Merge strategy based on scope of requirement and the data. Created Match and consolidation strategy for complex Requirement.
- Worked extensively on Informatica push down optimization and Teradata indexes to support the extensive load volume efficiently.
- Created Teradata DDL and DML scripts for tables, views, transaction tables, triggers and stored procedures for base tables and CDC processes in all layers.
- Tested mappings, workflows and sessions to identify bottlenecks and tune them for better performance. Prepared effective unit, integration and system test cases for various stages to capture data discrepancies/inaccuracies and ensure the successful loading of accurate data.
- Extensively worked on CDC to capture data changes in sources for the delta load. Used the Debugger to validate mappings and obtain troubleshooting information about the data and error conditions.
- Migrated code from Dev to Test and Test to Prod. Wrote detailed migration documentation covering system compatibility, objects and parameter files for a smooth transfer of code between environments.
- Designed the automation process of Sessions, Workflows, scheduled the Workflows, created Worklets and configured them according to business logics & requirements to load data from different Sources to Targets.
- Designed and developed Jobs by Unix Scripts for Control-M to schedule workflows. Created Pre-& Post-Sessions UNIX Scripts, SQL Functions, Triggers and Stored Procedures to drop & re-create the indexes and to solve the complex calculations on data.
- Guided the other developers and provided the technical solutions in need, peer reviewed their codes to meet the accurate business requirements and project standardization.
- Conducted daily/weekly meetings, monitored the work progresses of teams and proposed ETL strategies.
- Coordinated multiple Confidential Banking releases throughout the year for the ETL team, which required tracking development status, test cycle executions, review, release management and deployment of code.
- Involved in imparting domain and technical knowledge to new entrants as well as fellow team members.
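The delta-load CDC described above can be illustrated with a hash-compare sketch in Python; the key and column names are hypothetical, and in practice this ran inside Informatica mappings against Teradata rather than in Python:

```python
import hashlib

def cdc_delta(previous, current, key="acct_id"):
    """Hash-compare style change data capture: classify incoming rows
    as inserts or updates against the prior snapshot. Rows whose
    non-key payload hashes match are unchanged. Names are illustrative."""
    def row_hash(row):
        payload = "|".join(str(v) for k, v in sorted(row.items()) if k != key)
        return hashlib.md5(payload.encode()).hexdigest()

    prev_hashes = {r[key]: row_hash(r) for r in previous}
    inserts, updates = [], []
    for row in current:
        if row[key] not in prev_hashes:
            inserts.append(row)          # new business key
        elif prev_hashes[row[key]] != row_hash(row):
            updates.append(row)          # existing key, changed payload
    return inserts, updates

prev = [{"acct_id": 1, "balance": 100}, {"acct_id": 2, "balance": 50}]
curr = [{"acct_id": 1, "balance": 100}, {"acct_id": 2, "balance": 75},
        {"acct_id": 3, "balance": 10}]
inserts, updates = cdc_delta(prev, curr)
```

Only the insert and update sets then need to flow into the delta load, which is what keeps the load volume manageable.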
Environment: - Informatica PowerCenter 9.x, IDQ 9.x, Teradata 13, SQL, PL/SQL, SQL Assistant, Putty, WinSCP, UNIX Shell Scripting, SQL-Developer, JIRA, Control-M, SVN.
Confidential, Texas
Development and Production Support Team lead
Responsibilities: -
- Performed Project Requirement Analysis, System Study and Estimation.
- Guided and supervised ten onsite and offshore resources.
- Requirement gathering for all new development and enhancement projects.
- Automation of existing manual jobs and processes reducing manual efforts.
- Continuous tracking of production issues and provide solutions to prevent recurring issues.
- Coordination between client business leads and Confidential development team to ensure quality services.
- Involved in monitoring of production ETL applications and Database scripts, Production Support through tickets. Executing Scripts / Managing scheduled activities based on business requirements.
- Continuous performance tuning of production jobs providing improvement in the completion times for critical production jobs.
- Project Plan Preparation and Scheduling all enhancement and development activities.
- Worked with team to develop in-house knowledge repository for best practices, solution documentations, manuals, and procedures for user education
- Prepared proofs of concept (POCs) in the client environment, passed them to the developers, and oversaw their implementation in the respective client environments.
- Communicate deliverables between Confidential onsite-offshore and work with the clients.
- Provide reports on open/closed request segregated based on severity/priority levels and Analysis of the internal systems used at JPMC.
- Guide team members in all technical challenges faced in the execution of project
- Knowledge Management to create repository of system/business/domain knowledge
- Involvement in implementation of the code changes to the COAMGR source system programs and jobs.
- Reverse engineer the existing ETL code to identify the transformation/business logic.
- Effort and Schedule Tracking, Metrics Collection, Analysis and Reporting.
- Responsible for ensuring 24x7 production support for the project.
- Provided statistical reports to the business on various metrics, such as SLA misses and the number of production issues faced during the month.
- Induction of new resources to team and knowledge transition
- Weekly Status Conference with the Client Project Management
- Internal/External Quality Audit Representative of Project.
- Ensure adherence to Confidential /JPMC Standards in Release/Change Management.
- Primary escalation point for all issues in JPMC Default MIS COAMGR project.
Environment: - Informatica (8.x, 9.x), Oracle SQL*Plus, PL/SQL, SQL Developer, TOAD, Unix, WinSCP, Putty, Microsoft SQL Server, Cognos, Control-M scheduler.
Confidential
Senior Application Developer
Responsibilities: -
- Analysed the data based on requirements and wrote the techno-functional documentation.
- Designed and developed various complex SCD Type1/Type2 Informatica mappings in different layers, migrated the codes from Dev to Test to Prod environment.
- Tested mappings, workflows and sessions to figure out the bottleneck to tune them for better performance.
- Prepared Unit, Integration and System test cases for various stages to capture the data discrepancies/ inaccuracies to ensure the successful execution of accurate data loading.
- Developed mappings using Informatica Data Quality (IDQ) to remove noise from the data and performed unit testing for data accuracy.
- Used IDQ for initial data profiling and to remove duplicate data. Involved in analysis and fine-tuning of the match and merge strategy based on the scope of the requirement and the data. Created a match and consolidation strategy for complex requirements.
- Guide team members in all technical challenges faced in the execution of project
- Responsible for transforming and loading large sets of structured data from heterogeneous sources.
- Prepared ETL technical mapping documents, along with test cases for each mapping, for future developments to maintain the SDLC.
- Created reusable components such as user-defined functions, mapplets and stored procedures.
- Wrote various functions, triggers and stored procedures to drop and re-create indexes and to solve complex calculations on data.
- Coordinating with the onsite team daily to capture the requirements changes in the ETL code.
- Produced bug-free code and error-free documents, contributing to delivering products well within schedule.
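The match-and-consolidate strategy above can be illustrated with a toy Python equivalent. The similarity threshold, field names and survivorship rule here are assumptions for illustration, not the actual IDQ configuration:

```python
from difflib import SequenceMatcher

def match_and_consolidate(records, threshold=0.85):
    """Toy version of an IDQ-style match/merge: group records whose
    names are similar above a threshold, then keep the most complete
    record in each group as the survivor. Names are illustrative."""
    groups = []
    for rec in records:
        for group in groups:
            if SequenceMatcher(None, rec["name"].lower(),
                               group[0]["name"].lower()).ratio() >= threshold:
                group.append(rec)  # matched an existing cluster
                break
        else:
            groups.append([rec])   # start a new cluster
    # survivorship rule: prefer the record with the most populated fields
    return [max(g, key=lambda r: sum(1 for v in r.values() if v))
            for g in groups]

masters = match_and_consolidate([
    {"name": "Jon Smith", "phone": ""},
    {"name": "John Smith", "phone": "555-0101"},
    {"name": "Mary Jones", "phone": "555-0202"},
])
```

Real IDQ match strategies use configurable key generation and scored match rules rather than a single string-similarity ratio, but the cluster-then-survive shape is the same.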
Environment: - Informatica PowerCenter 8.6, Informatica Data Quality (IDQ), Rapid SQL 7.2.1, SQL, PL/SQL, MySQL, Putty, WinSCP, UNIX Shell Scripting, SQL-Developer, Python.
Confidential
Application Developer
Responsibilities:
- Participated in various stages of data and requirement analysis for project needs.
- Developed a prototype based on Teradata FSLDM architecture before the project start.
- Developed and tested mappings, workflows, BTEQ, Fastload, Fastexport and Multiload scripts, which handles delta logic as per the architecture.
- Prepared proofs of concept (POCs) in the client environment, passed them to the developers, and implemented them in the respective client environments.
- Implementation of POC using pushdown optimization in Informatica for performance tuning
- Created/modified DDL (Data Definition Language), choosing the UPI (Unique Primary Index), PPI (Partitioned Primary Index) and SI (Secondary Index).
- Implementation of the Error and Audit Model specifically designed for the EDW Integration projects under stringent time lines.
- Performed key data profiling steps that identified the mapping issues related to historical conversion needs for Hierarchy/GL trending reports.
- Designed the automation process of Sessions, Workflows, scheduled the Workflows, created Worklets and configured them according to business logics & requirements to load data from different Sources to Targets.
- Used Debugger to validate the Mappings and gained troubleshooting information about the data and error conditions. Involved in fixing the invalid Mappings.
- Involved in developing a comprehensive system Test Plan and created the Test Cases.
- Created pre- and post-session UNIX scripts to merge the flat files, create and delete temporary files, and change file names to reflect the file generation date.
- Ensured the code adhered to Confidential/WaMu standards.
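A pre-session script of the kind described above (merge flat files into one load file and stamp it with the generation date) might look like this in outline; the file names and layout are illustrative, and the real scripts were UNIX shell rather than Python:

```python
import os
import tempfile
from datetime import date

def merge_and_stamp(paths, out_dir, base_name="extract"):
    """Concatenate the source flat files into one load file whose name
    carries the generation date, then remove the merged pieces.
    File and directory names are illustrative."""
    out_path = os.path.join(out_dir, f"{base_name}_{date.today():%Y%m%d}.dat")
    with open(out_path, "w") as out:
        for path in paths:
            with open(path) as part:
                out.write(part.read())
            os.remove(path)  # clean up the temporary piece
    return out_path

# build two tiny part files in a scratch directory, then merge them
tmp = tempfile.mkdtemp()
parts = []
for i, text in enumerate(["a,1\n", "b,2\n"]):
    p = os.path.join(tmp, f"part{i}.dat")
    with open(p, "w") as f:
        f.write(text)
    parts.append(p)
merged = merge_and_stamp(parts, tmp)
```

The date-stamped name is what lets downstream sessions and the scheduler pick up exactly one generation of the file.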
Environment: - Teradata V2R6.2, Informatica 8.1, Teradata Utility tools (BTEQ, Fastload, Fastexport, Multiload and TPUMP), Putty, WinSCP, UNIX Shell Scripting, Teradata SQL Assistant.
Confidential
Application Developer
Responsibilities:
- Responsible for Physical Implementation of Teradata system which includes converting Oracle DDL to Teradata DDL using Oracle’s data dictionary tables
- Participated in various stages of data and requirement analysis for project needs.
- Designed the automation process of Sessions, Workflows, scheduled the Workflows, created Worklets (command, email, assignment, control, event wait/raise, conditional flows etc) and configured them according to business logics & requirements to load data from different Sources to Targets.
- Used Debugger to validate the Mappings and gained troubleshooting information about the data and error conditions. Involved in fixing the invalid Mappings.
- Involved in developing a comprehensive system Test Plan and created the Test Cases.
- Implemented bug-free Informatica code to migrate data from Oracle to the Teradata database.
- Analysed and tuned the FastLoad scripts, taking performance factors into account, to move data from the ODS to the staging area and from the staging area to the target area.
- Created pre- and post-session UNIX scripts to merge the flat files, create and delete temporary files, and change file names to reflect the file generation date.
- Prepared detailed Functional Specification/Design and Review documents.
- Wrote SQL queries to test the application for data integrity.
- Ensured the code adhered to Confidential/WaMu standards.
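Data-integrity queries for a migration like the Oracle-to-Teradata one above typically compare row counts and look for orphaned keys. A sketch, with sqlite standing in for the real databases and all table and column names illustrative:

```python
import sqlite3

# sqlite stands in for Oracle (source) and Teradata (target) here;
# src_accounts / tgt_accounts are hypothetical names.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_accounts (acct_id INTEGER PRIMARY KEY, balance REAL);
    CREATE TABLE tgt_accounts (acct_id INTEGER PRIMARY KEY, balance REAL);
    INSERT INTO src_accounts VALUES (1, 100.0), (2, 50.0), (3, 10.0);
    INSERT INTO tgt_accounts VALUES (1, 100.0), (2, 50.0);
""")

# reconciliation check 1: row counts must match after the load
src_count = conn.execute("SELECT COUNT(*) FROM src_accounts").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_accounts").fetchone()[0]

# reconciliation check 2: keys present in source but missing in target
missing = conn.execute("""
    SELECT s.acct_id FROM src_accounts s
    LEFT JOIN tgt_accounts t ON s.acct_id = t.acct_id
    WHERE t.acct_id IS NULL
""").fetchall()
```

A count mismatch or a non-empty orphan list flags the load for investigation before sign-off.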
Environment: Informatica Power Center 7.2, Teradata V2R6.2, Oracle 10g, PL/SQL, Toad, Query Man, UNIX Shell Scripting, BTEQ.