
Enterprise Architect Resume

SUMMARY:

  • Solutions Architect/Big Data Developer with 10+ years of experience in Data Warehousing, Data Mining, ETL/ELT, Business Intelligence, Data Integration, Data Quality, Data Governance, Metadata, and Big Data.
  • Hortonworks Certified Hadoop Developer with 3+ years of experience in data migration to a Hadoop data lake using Pig Latin scripts, Hive, Sqoop, Oozie, Ambari, YARN, ZooKeeper, MapReduce, Flume, HCatalog, and Hue.
  • Microsoft Certified Cloud Data Platform Solutions Architect
  • Worked closely with business users to build business acumen; prepared BRD and PR documents
  • Strong understanding of enterprise data warehouse architecture; responsible for data model design using ERwin/PowerDesigner. Extensive experience in analysis, requirements gathering, design, and creation of technical documentation for OLTP and OLAP systems
  • Responsible for business analysis processes: communicated with stakeholders and with business and technical teams to prepare business requirement and technical documents per standards
  • Expertise in writing Pig Latin scripts to sort, group, join, and filter data as part of transformations driven by business requirements (see the first sketch after this summary)
  • Expertise in handling optimized join strategies such as replicated, skewed, and merge joins
  • Expertise in extending Hive and Pig core functionality with custom UDFs
  • Efficient with the Hive data warehouse tool: creating tables, distributing data through partitioning and bucketing, and writing and optimizing HiveQL queries (see the Hive sketch after this summary)
  • Expertise in importing and exporting data with Sqoop between HDFS and relational database systems (see the Sqoop sketch after this summary)
  • In-depth knowledge in Data Analysis, Source Data Profiling, Requirement analysis
  • Hands-on experience in designing and developing Master Data Management Solutions
  • Experienced in ETL Design, ETL Development, ETL Testing, Release management and maintenance Phases of Project Life cycle
  • Experienced in ETL Optimization and Performance tuning
  • Provided performance-tuning solutions across the enterprise data warehouse for faster data retrieval
  • Designed and developed automated I/O feeds on Linux for advanced-analytics data preparation and data sourcing, interacting with SAS E-Guide and SAS E-Miner predictive analytics code
  • Experienced in leading a team of 20 members across onsite and offshore
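
A minimal sketch of the kind of Pig Latin transformation and replicated (map-side) join described above, wrapped in a shell driver as it would be run on Linux; all HDFS paths, schemas, and field names below are hypothetical:

#!/usr/bin/env bash
# Illustrative only: paths, schemas, and field names are hypothetical.
cat > /tmp/order_summary.pig <<'PIG'
orders = LOAD '/data/raw/orders' USING PigStorage(',')
         AS (order_id:chararray, cust_id:chararray, amount:double, status:chararray);
-- small lookup data set; listed last in the JOIN so it can be replicated to each mapper
custs  = LOAD '/data/raw/customers' USING PigStorage(',')
         AS (cust_id:chararray, region:chararray);

shipped = FILTER orders BY status == 'SHIPPED';
joined  = JOIN shipped BY cust_id, custs BY cust_id USING 'replicated';
slim    = FOREACH joined GENERATE custs::region AS region, shipped::amount AS amount;
grouped = GROUP slim BY region;
summary = FOREACH grouped GENERATE group AS region, SUM(slim.amount) AS total_amount;
ranked  = ORDER summary BY total_amount DESC;
STORE ranked INTO '/data/curated/order_summary' USING PigStorage(',');
PIG
pig -f /tmp/order_summary.pig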
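
A minimal HiveQL sketch of the partitioned, bucketed table design and tuned queries described above, again wrapped in a shell driver; the database, table, and column names are hypothetical:

#!/usr/bin/env bash
# Illustrative only: database, table, and column names are hypothetical.
cat > /tmp/orders_curated.hql <<'HQL'
SET hive.exec.dynamic.partition=true;
SET hive.exec.dynamic.partition.mode=nonstrict;
SET hive.enforce.bucketing=true;

CREATE TABLE IF NOT EXISTS sales.orders_curated (
  order_id STRING,
  cust_id  STRING,
  amount   DOUBLE
)
PARTITIONED BY (order_date STRING)
CLUSTERED BY (cust_id) INTO 32 BUCKETS
STORED AS ORC;

-- dynamic-partition load from a raw staging table
INSERT OVERWRITE TABLE sales.orders_curated PARTITION (order_date)
SELECT order_id, cust_id, amount, order_date
FROM sales.orders_staging;

-- partition pruning keeps the scan to a single day
SELECT cust_id, SUM(amount) AS total_amount
FROM sales.orders_curated
WHERE order_date = '2016-01-31'
GROUP BY cust_id;
HQL
hive -f /tmp/orders_curated.hql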
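
A minimal Sqoop sketch of the import/export between a relational system and HDFS described above; the JDBC URL, credentials, directories, and table names are hypothetical:

#!/usr/bin/env bash
# Illustrative only: connection details and table names are hypothetical.

# Import an Oracle table into HDFS with 4 parallel mappers, split on the primary key
sqoop import \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
  --username etl_user --password-file /user/etl/.oracle_pwd \
  --table ORDERS \
  --split-by ORDER_ID \
  --num-mappers 4 \
  --target-dir /data/raw/orders \
  --fields-terminated-by ','

# Export the processed results back to the relational system
sqoop export \
  --connect jdbc:oracle:thin:@dbhost:1521/ORCL \
  --username etl_user --password-file /user/etl/.oracle_pwd \
  --table ORDER_SUMMARY \
  --export-dir /data/curated/order_summary \
  --input-fields-terminated-by ','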

TECHNICAL SKILLS:

Big Data Technologies: Pig, Hive, Sqoop, Oozie, Ambari, YARN, ZooKeeper, MapReduce, Flume, HCatalog, Hue

Hadoop Distribution: Hortonworks

Programming Languages: SQL, PL/SQL, Unix Shell Scripting, C, C++, Java

Databases: HP Vertica, Oracle, NeoView, Teradata

ETL Tools/MDM: Informatica PC 9.x/8.x/7.x/6.x, DIAL (HP ETL tool), Talend MDM, Talend DI

Operating Systems: Linux, Windows 98/NT/2000/XP/7/8

Reporting Tools: YOTTA (HP Reporting tool), Business Objects

Advanced Analytics: Base SAS Programming, SAS E-Guide, SAS E-Miner, and SAS Visual Analytics

Methodologies: Agile, Waterfall

Cloud Data Platform: Microsoft Azure, Azure SQL Server

PROFESSIONAL EXPERIENCE:

Confidential

Software: HDP 2.3, Talend DI, CentOS, Java

Enterprise Architect

Responsibilities:

  • Installed and configured internal Hadoop cluster
  • Worked closely with Big Data Architects on the design of BDIT (Big Data Integration Tool).
  • Performed internal POC on Talend DI integration with HDP.
  • Wrote extensive Hive jobs as well as Pig Latin scripts for data loading and manipulation.
  • Created Sqoop jobs to export/import data from different RDBMSs (SQL Server, Oracle, Teradata); see the sketch after this list.
  • Worked as a Hadoop trainer, teaching big data skills to internal employees.
  • Installed HDP cluster on AWS for team assignments.
  • Created solution proposals for different clients.
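
A minimal sketch of a reusable, incremental Sqoop job of the kind referenced above; the connection string, credentials, and table names are hypothetical:

#!/usr/bin/env bash
# Illustrative only: connection details and table names are hypothetical.
# Defines a saved Sqoop job that pulls only rows added since the last run.
sqoop job --create orders_incremental -- import \
  --connect "jdbc:sqlserver://dbhost:1433;databaseName=SALES" \
  --username etl_user --password-file /user/etl/.mssql_pwd \
  --table ORDERS \
  --target-dir /data/raw/orders \
  --incremental append \
  --check-column ORDER_ID \
  --last-value 0

# Execute the saved job; Sqoop records and advances the last value automatically
sqoop job --exec orders_incremental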

Confidential

Software: Talend MDM, PeopleSoft, Salesforce, ServiceNow, Tableau, Hyperion

MDM Architect

Responsibilities:

  • Worked as a Senior Enterprise Architect for the Product Master Data Management solution.
  • Gained thorough knowledge of the organization-wide product data life cycle and performed product data analysis.
  • Requirement analysis included source-system study, current-state architecture, high-level data flow diagrams, and data integration feasibility.
  • Identified appropriate MDM solution based on the business needs.
  • Performed Proof of Concept Integrating Salesforce product data into Talend MDM along with Data Synchronization and Data Publication.
  • Delivered Proof of Concept, MDM Requirement and Proposal Artifacts.
  • Created the logical and physical models and the source-to-target mapping document

Confidential

Software: HDP 2.1, Unix Shell Scripting, DIAL, YOTTA, HP Vertica, SEAQUEST, Base SAS Programming, E-Miner, E-Guide, Informatica 9.5

Data warehouse Architect

Responsibilities:

  • Worked with the business and technical teams to identify the source systems/applications to be migrated to the Hadoop data lake.
  • Identified ETL requirement changes for HPE and HPI.
  • Prepared design specifications and performed data analysis.
  • Created the data warehouse star-schema design and dashboards with drill-down, forecasting, and regression analysis
  • Performance tuning and code optimization
  • Coordinated with source system for Inbound Specification changes, development, SIT/UAT Testing.
  • Wrote the Sqoop code and Sqoop jobs to load aggregated data from Hadoop into the Vertica database (see the sketch after this list)
  • Designed and developed an automation solution for stand-alone E-Miner predictive engine code using HP Vertica, Unix, Base SAS, and Informatica.
  • Worked closely with the data source team; designed and developed ETL and data preparation code for the Third Party Maintainer and Data Center Care predictive engines.
  • Automated input and output feeds to SAS E-Miner and fed prediction results back to QlikView dashboards.
  • Provided technical assistance in delivering code, reviewed it, and ensured adherence to standards and best practices
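
A minimal sketch of a Sqoop export of aggregated Hive output from HDFS into Vertica, as referenced above; the Vertica host, schema, table names, and warehouse path are hypothetical, and the Vertica JDBC driver jar is assumed to be on Sqoop's classpath:

#!/usr/bin/env bash
# Illustrative only: host, schema, table, and path names are hypothetical.
# Pushes an aggregated Hive result set (delimited text in HDFS) into Vertica.
sqoop export \
  --connect jdbc:vertica://vertica-host:5433/analytics \
  --driver com.vertica.jdbc.Driver \
  --username etl_user --password-file /user/etl/.vertica_pwd \
  --table EDW.DAILY_REVENUE_AGG \
  --export-dir /apps/hive/warehouse/edw.db/daily_revenue_agg \
  --input-fields-terminated-by '\001' \
  --num-mappers 8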

Confidential

Software: Informatica 9.x, DIAL ETL, YOTTA reporting, BO reporting, NeoView, SQL Server, HP Vertica

Technical Lead

Responsibilities:

  • Analyzed and understood the project requirements.
  • Prepared ETL design specifications and performed development.
  • Wrote complex ETL mappings using Informatica PC.
  • Reviewed ETL code and ensured adherence to standards and best practices
  • Provided performance-tuning recommendations for ETL mappings and Oracle queries
  • Interacted with end users during UAT testing.
  • Suggested database performance tuning and code optimization through vsql scripts.
  • Trained project internal/external team on HP proprietary ETL tool sets.

Confidential

Software: Informatica 7.1.3, Oracle 9i, Teradata, UNIX Shell Scripting

ETL Specialist - Consultant/System Analyst

Responsibilities:

  • Analyzed and understood the project requirements
  • Developed ETL mappings using Informatica PC
  • Performed unit testing and code migration
  • Worked extensively on ETL change request and maintenance
