ETL / Big Data Solution Architect Resume
GA
SUMMARY:
- 12+ years of total IT experience in software development, with excellent skills primarily in ETL architecture: design and development of end-to-end data warehousing (DWH) solutions, ETL design using Informatica PowerCenter 9.x/8.x, Oracle Exadata, and Oracle 11g/10g/9i, ETL process design, and application programming under SDLC and Agile methodologies
- 11+ years of IT experience designing data warehousing applications, data migrations, and decision support systems
- Outstanding experience with data warehouse technologies and data integration using the ETL tool Informatica, plus Teradata and Oracle
- Earned an AWS badge for Big Data implementation
- Expert in extracting real-time JSON and XML payloads from REST-based APIs into the DWH
- Implemented solutions with HDP and NiFi
- Spearheaded ETL solution architecture, design, development, and implementation of analytical solutions; also responsible for leading the analysis, comprehension, and translation of business needs and requirements into the creation and enhancement of logical and physical data models
- Expert in designing and developing exception handling, restart logic, and reload strategies for high-volume data loads
- Expertise in multiple levels of dimensional loading: staging, data cleansing, data mart, error mart, DWH, and data access/ODS
- Expert in designing end-to-end data warehousing solutions and implementing ETL processes for SCD Type 1 and Type 2 dimensions, conformed dimensions, junk dimensions, facts, aggregations, and references (see the SCD Type 2 sketch after this list)
- Expert in providing optimal ETL solutions by designing and utilizing parallel/partitioned processing at both the ETL and database levels
- Experience in performance tuning and debugging of Informatica mappings/sessions
- Excellent experience writing UNIX shell scripts on Sun Solaris and AIX
- Rich experience in client-server development, data warehousing, data marts, strategic reporting, analysis, data profiling, and data cleansing techniques
- NCR Teradata certified for Teradata Database V2R5
- Exposure to BusinessObjects Designer and reporting tools
- Hands-on experience integrating ETL program components with various scheduling tools such as Maestro, Event Coordinator, and Control-M using UNIX shell scripts
- Successfully handled various roles, including ETL Architect, Sr. ETL Developer, Tech Lead, Onsite Coordinator (onshore-offshore model), Systems Analyst, and ETL Developer
- Wide range of IT industry experience across the Telecom, Retail, Hospitality, and Travel & Tourism domains, with strong analytical, problem-solving, organizational, communication, learning, and team skills; experienced in coordinating cross-functional teams
- Natural curiosity and a strong passion for empirical research and problem solving
- Excellent analytical, communication, organizational, and leadership skills
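The SCD Type 2 loading noted above typically follows an expire-and-insert pattern. Below is a minimal Oracle-style sketch; the customer_dim and customer_stg tables, their columns, and the sequence are hypothetical illustrations, not an actual client schema:

```sql
-- Step 1: expire the current dimension row when a tracked attribute changes.
UPDATE customer_dim d
SET    d.effective_end_dt = SYSDATE,
       d.current_flag     = 'N'
WHERE  d.current_flag = 'Y'
  AND EXISTS (SELECT 1
              FROM   customer_stg s
              WHERE  s.customer_id = d.customer_id
                AND  s.address    <> d.address);   -- tracked SCD2 attribute

-- Step 2: insert the new version as the current row.
INSERT INTO customer_dim
       (customer_key, customer_id, address,
        effective_start_dt, effective_end_dt, current_flag)
SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address,
       SYSDATE, DATE '9999-12-31', 'Y'
FROM   customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   customer_dim d
                   WHERE  d.customer_id  = s.customer_id
                     AND  d.current_flag = 'Y'
                     AND  d.address      = s.address);
```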
TECHNICAL SKILLS:
Data Warehousing: Informatica PowerCenter 9.6/9.5/9.0.1/8.6/8.5/8.1.1/7.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), PowerConnect, MS SQL Server 2000 Analysis Services, OLAP, OLTP, ETL, Autosys, Maestro, Data Cleansing, Data Profiling
BigData: HDP (Hortonworks Data Platform), Hadoop HDFS, Hive, HQL, NiFi, Sqoop, Oozie
Data Modeling: Dimensional data modeling - Fact & Dimension tables, logical & physical data modeling, Star & Snowflake schemas, relational, dimensional, and multidimensional modeling, de-normalization techniques, Erwin, Rational Rose
Databases: Oracle Exadata, Oracle 11g/10g/9i/8i, Teradata V13/12, MS SQL Server 2008/2005
Languages/Web: SQL, PL/SQL, UNIX Shell Scripting
Other Tools: TOAD, Teradata SQL Assistant, SQL Developer, WinSQL, PuTTY, WinSCP, Autosys, Maestro, Informatica scheduler
Version Control Tools: SVN, Harvest, IBM Lotus Quickr
Environment: UNIX, Windows 2000/2003/XP, Windows NT, Red Hat Linux 7.2
PROFESSIONAL EXPERIENCE:
Confidential, GA
ETL / Big Data Solution Architect
Environment: Power Designer, Informatica 9.6, Oracle Exadata, TOAD, Shell Script, HDP (Hortonworks Data Platform), Hadoop HDFS, Hive, NiFi, Sqoop
Responsibilities:
- Designed and created a process that extracts VOD usage data from legacy Oracle systems and ingests it into the Hadoop ecosystem.
- Designed and created a process that listens for on-demand requests from the client UI via HTTP POSTs and parses the JSON message to create dynamic queries executed against Hive tables (see the query sketch after this list).
- Designed and created a process in Apache NiFi to handle real-time requests and return the output to CCM within 120 seconds of receiving the request.
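As an illustration of the dynamic Hive queries described above, the sketch below shows the kind of statement such a NiFi flow might assemble after extracting fields from the JSON request (for example with EvaluateJsonPath); the vod_usage table, its columns, and the request fields are hypothetical placeholders, not the production schema:

```sql
-- Hypothetical request payload:
--   {"accountId": "A1001", "startDate": "2017-01-01", "endDate": "2017-01-31"}
-- NiFi substitutes the extracted values into a query template and runs it
-- against Hive, returning the result to the caller.
SELECT account_id,
       asset_id,
       SUM(view_seconds) AS total_view_seconds
FROM   vod_usage
WHERE  account_id = 'A1001'              -- from JSON accountId
  AND  usage_date BETWEEN '2017-01-01'   -- from JSON startDate
                      AND '2017-01-31'   -- from JSON endDate
GROUP BY account_id, asset_id;
```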
Confidential, GA
ETL / Big Data Solution Architect
Environment: Power Designer, Informatica 9.6, Oracle Exadata, TOAD, Shell Script, HDP (Hortonworks Data Platform), Hadoop HDFS, Hive, Sqoop
Responsibilities:
- Designed the solution on the Big Data HDP ecosystem and the ETL processes in Informatica.
- Designed the data flow orchestration for data arriving from multiple systems at 15-minute intervals.
- Designed a file configuration system to track all files for upstream and downstream systems; the process handles nearly 300 files with different metrics, and the configuration adapts dynamically to the number of scenarios to be processed.
- Designed UNIX shell scripts to provide additional functionality around the Big Data processes.
- Managed communication with the customer, providing valuable suggestions on business logic and incorporating business process changes in the ETL as well as in the database.
- Created efficient HDFS folder hierarchies for data storage and for Hive tables (see the sketch after this list).
- Provided leadership and work guidance.
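One common way to organize such HDFS hierarchies, sketched below under assumed names (the usage_metrics table, its columns, delimiter, and paths are hypothetical), is a date-partitioned external Hive table, so each file drop lands in its own folder with no data movement:

```sql
-- Hypothetical HDFS layout: /data/upstream/usage_metrics/load_dt=YYYY-MM-DD/
CREATE EXTERNAL TABLE IF NOT EXISTS usage_metrics (
  file_name    STRING,
  metric_name  STRING,
  metric_value DOUBLE
)
PARTITIONED BY (load_dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '/data/upstream/usage_metrics';

-- Register each new folder as its files arrive:
ALTER TABLE usage_metrics ADD IF NOT EXISTS
  PARTITION (load_dt = '2017-06-01')
  LOCATION '/data/upstream/usage_metrics/load_dt=2017-06-01';
```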
Confidential, GA
ETL Architect
Environment: Power Designer, Informatica 9.6, Oracle Exadata, TOAD, Shell Script, Kafka REST API
Responsibilities:
- Served as ETL Architect and ETL Tech Lead; worked closely with Reporting/Business Analysts, Reporting developers, Data Modelers, source system owners, and ETL developers to ensure the design and implementation of optimal DW/BI solutions.
- Designing Technical Specification documentation relating to warehouse system design and implementation: Technical Architecture Specifications, Source to Target Mappings, Data Flow Diagrams, and Operational Manuals.
- Coordinating with development/admin resources across both onsite and offshore locations to implement, maintain, improve, and support solutions according to agreed BI/DW strategy. Providing leadership and work guidance.
- Performed source data analysis, data discovery, and data profiling required for data modeling.
- Troubleshot issues and provided solutions; identified resolutions to application functional and processing issues related to development code and database functionality; performed performance tuning on low-throughput sessions.
- Worked with DBAs to perform database tasks, helped troubleshoot and resolve production issues related to company processes and data integrity, performed issue root cause analysis, and provided downtime support as needed.
- Developed, tested, and implemented data integration solutions for the Data Warehouse using Informatica, UNIX ksh scripts, PL/SQL, and other ETL tools and technologies.
- Provided optimal solutions for ETL processes, using partitioning strategies and utilizing Grid to improve ETL performance; used Oracle DB partitions (range and list partitions and sub-partitions) to add parallel functionality (see the DDL sketch after this list).
- Assignment of ETL, Unit & integration testing project tasks
- Assist Business with UAT test case and script creation
- Create/review Change Management tickets for the DW team for promotion of objects to the Production environment
- Review applications (objects) developed by the offshore team and make sure that they meet the business requirements and follow the required quality processes on a day-to-day basis.
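The Oracle partitioning mentioned in this list can be sketched as below; the sales_fact table, its columns, and the partition boundaries are hypothetical examples of range partitions with list sub-partitions, chosen so ETL sessions and queries can work on partitions in parallel:

```sql
CREATE TABLE sales_fact (
  sale_id   NUMBER,
  sale_dt   DATE,
  region_cd VARCHAR2(4),
  amount    NUMBER(12,2)
)
PARTITION BY RANGE (sale_dt)
SUBPARTITION BY LIST (region_cd)
SUBPARTITION TEMPLATE (
  SUBPARTITION sp_east  VALUES ('EAST'),
  SUBPARTITION sp_west  VALUES ('WEST'),
  SUBPARTITION sp_other VALUES (DEFAULT)
)
(
  PARTITION p_2016_q4 VALUES LESS THAN (DATE '2017-01-01'),
  PARTITION p_2017_q1 VALUES LESS THAN (DATE '2017-04-01')
);
```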
Confidential, FL
ETL Architect
Environment: ERwin, Informatica 9.5, Oracle 11G, TOAD, Shell Script
Responsibilities:
- Served as ETL Architect and ETL Tech Lead; worked closely with Reporting/Business Analysts, Reporting developers, Data Modelers, source system owners, and ETL developers to ensure the design and implementation of optimal DW/BI solutions.
- Designing Technical Specification documentation relating to warehouse system design and implementation: Technical Architecture Specifications, Source to Target Mappings, Data Flow Diagrams, and Operational Manuals.
- Coordinating with development/admin resources across both onsite and offshore locations to implement, maintain, improve, and support solutions according to agreed BI/DW strategy. Providing leadership and work guidance.
- Troubleshot issues and provided solutions; identified resolutions to application functional and processing issues related to development code and database functionality; performed performance tuning on low-throughput sessions.
- Worked with DBAs to perform database tasks, helped troubleshoot and resolve production issues related to company processes and data integrity, performed issue root cause analysis, and provided downtime support as needed.
- Developed, tested, and implemented data integration solutions for the Data Warehouse using Informatica, UNIX ksh scripts, PL/SQL, and other ETL tools and technologies.
- Provided optimal solutions for ETL processes, using partitioning strategies and utilizing Grid to improve ETL performance; used Oracle DB partitions (range and list partitions and sub-partitions) to add parallel functionality, as in the DDL sketch above.
- Assignment of ETL, Unit & integration testing project tasks
- Assist Business with UAT test case and script creation
- Create/review Change Management tickets for the DW team for promotion of objects to the Production environment
- Review applications (objects) developed by the offshore team and make sure that they meet the business requirements and follow the required quality processes on a day-to-day basis.
Confidential, FL
ETL Architect
Environment: ERwin, Informatica 9.5, Oracle 11G, TOAD, Shell Script
Confidential Implementation:
Responsibilities:
- Creating Functional Data Warehouse (DW) analysis & design documents
- Designing Extract, Transform & Load (ETL) Technical Specifications.
- Manage the offshore resources on a day-to-day basis
- Assignment of ETL, Unit & integration testing project tasks
- Responsible for offshore deliverables
- Maintain production Enterprise Data Warehouse environment
- Perform performance tuning on low-throughput sessions
- Work with DBAs to resolve database performance issues
- Create QA Test Plans.
- Assist Business with UAT test case and script creation.
- Create/review Change Management tickets for the DW team for promotion of objects to the Production environment
- Review applications (objects) developed by the offshore team and make sure that they meet the business requirements and follow the required quality processes
Confidential, FL
Tech Lead
Environment: Informatica 9.1, Oracle 10g, TOAD, MS Excel, SQL, PL/SQL, Shell Script
Responsibilities:
- Create Functional Data Warehouse (DW) analysis & design documents
- Create Extract, Transform & Load (ETL) Specifications (Source to target and Business rules)
- Create ETL mappings to meet business requirements
- Manage the offshore resources on a day-to-day basis
- Assignment of ETL, Unit & integration testing project tasks
- Responsible for offshore deliverables
- Maintain production Enterprise Data Warehouse environment
- Perform performance tuning on low-throughput sessions
- Work with DBAs to resolve database performance issues
- Designing QA Test Plans, Test Cases, & Test Scripts
- Assist Business with UAT test case and script creation.
- Create/review Change Management tickets for the DW team for promotion of objects to the Production environment
- Review applications (objects) developed by the offshore team and make sure that they meet the business requirements and follow the required quality processes
Confidential, FL
Tech Lead
Environment: UNIX, Windows, Shell Script, Informatica, Oracle, TOAD, MS Excel
Responsibilities:
- Studied and understood the current system.
- Designed the ETL architecture and process for populating the dashboard from different sources.
- Described the high-level conceptual ETL model.
- Identified technical architecture risks and assumptions.
- Led the team, provided technical solutions to teammates, and coordinated team efforts.
- Created, tested, and debugged ETL mappings using Informatica.
- Quality-tested the ETL process and its performance.