Solution Architect - Information Management Resume
CA
SUMMARY
- Business Intelligence | Data Integration Solutions | Big Data Solutions | NoSQL Technologies | ETL | Complex SQL | Data Warehousing | Data Quality | Data Profiling | Data Capturing & Storing | Quality Assurance | Staff Training & Mentoring | Leadership & Strategy | Project Management | Front-End Software Products
- Hands-on Information Technology expert with over 17 years of experience specializing in building and managing data platforms, such as BI, ETL, DW and Big Data applications, and their teams.
- Provided effective end-to-end solutions in BI/ETL/DW/Big Data implementations; prepared patterns and frameworks to deliver reliable, cost-effective implementations.
- Streamlined operational efficiencies by building data platforms using emerging technologies including Hadoop, Hive, Spark, Splunk, and NoSQL databases such as Cassandra to deliver end-to-end data architecture solutions.
- Architected, designed and developed BI, ETL and DW solutions for multiple business units
- Recommended new metrics, techniques, and strategies to improve the Business Intelligence landscape
- Delivered reporting solutions including operational reporting, enterprise dashboards, and a secured multi-tenant OEM platform for retail, healthcare, manufacturing and financial clients using BI tools such as MicroStrategy and Tableau
- Hands-on with Big Data Components/Frameworks such as Hadoop, Spark, Pig, Hive, Sqoop, Flume
- Experienced in NoSQL databases such as Cassandra, MongoDB.
- Evaluated and applied emerging technologies to deliver effective solutions that meet end-user needs
- Designed and developed database programs in MPP database environments such as Teradata, Netezza and Vertica.
- Experienced in building Data Science teams for effective analytics implementations
- Built data integration platforms with high data volumes from varied data sources, including click stream data, web logs and structured/relational data.
- Minimized workflow lags by hiring top talent, training staff on best practices and protocols, and managing the team to ensure optimal productivity.
- Cultivated rapport with third-party professional services during onboarding and implementation processes.
TECHNICAL SKILLS
Big Data: Hadoop, Hive, Impala, Pig, Sqoop, Spark, Scala, Splunk, Cassandra, MongoDB, HBase
BI Reporting: MicroStrategy, Tableau, Business Objects
Databases: Teradata, Netezza, Vertica, Oracle, DB2, MS SQL Server, Sybase
ETL: Informatica PowerCenter, SnapLogic, Pentaho, SQL
ERP: SAP, Workday
Programming: C#, Java, Shell
PROFESSIONAL EXPERIENCE
Confidential
Solution Architect - Information Management
Responsibilities:
- Provided Big Data solutions to an agricultural biotechnology company. Responsible for delivering a solution that handles terabytes of data, built mainly on Hadoop, MapReduce, Hive, HBase, and Tableau.
- Provided Data Integration solutions to a managed health care company that markets and administers health insurance in the United States. Processed billions of records and terabytes of data on a monthly basis to calculate and assign a health indicator for all their customers; the data is used for research by their R&D team. Responsible for building the team, mentoring and guiding them for on-time delivery of the project.
- Designed and performed POCs using Big Data technologies as part of pre-sales activities
- Provided a User Behavior Analytics solution on the usage of various applications using Splunk.
- Provided reporting and analytics solutions to a leading provider of software and services to advance healthcare quality and safety performance. Its Software-as-a-Service (SaaS)-based applications help over 3,000 healthcare facilities improve their care environment, resource productivity and financial performance.
- Delivered a multi-tenant analytics platform serving thousands of users, with security built in at multiple levels to isolate sensitive healthcare data, and a scalable, sustainable data platform for near-real-time reporting needs.
- Responsible for delivering dynamic dashboards, standard reports and ad-hoc reporting capabilities, with the reporting application integrated into Java-based applications with single sign-on and security filters built via the MicroStrategy SDK.
Technical Environment: Hadoop, Spark, HDFS, Hive, Sqoop, Flume, CDH 5.x, MicroStrategy 9.3, Informatica PowerCenter 9.5, SnapLogic, Tableau, Teradata, Netezza, Vertica, Oracle 11g, and Shell Scripting.
Confidential
Solution Architect
Responsibilities:
- Prepared data and designed/architected data integration strategies and key deliverables, including end-to-end solutions for data warehousing and business intelligence systems for various high-profile customers.
- Reduced implementation gaps and streamlined backend data integration, data models and databases to support BI.
- Provided MicroStrategy Architect solutions for efficient schema development
- Built reports and dashboards for customers across various domains
- Provided solutions to multiple BI migration projects
- Redesigned models in Teradata and implemented BI applications on top of the new database.
- Improved data retrieval times and functionality by designing and architecting data platforms.
- Delivered data integration solutions using Informatica on cloud environment.
Technical Environment: MicroStrategy, Teradata, Hadoop, Hive, Impala, Tableau, Informatica PowerCenter, Informatica on Cloud, MS SQL Server, Shell Scripting and Linux.
Confidential, CA
Principal Architect, Data Intelligence COE
Responsibilities:
- Partnered with diverse architects and served as team lead, managing end-to-end implementation of the global subscription center (marketing) and Omniture (web traffic/analytics), and scoping various ETL processes.
- Coordinated full-lifecycle implementation of various projects in the data warehousing space, including pre-sales, project scoping, requirements gathering, analysis, design, implementation and production support activities.
- Spearheaded data architecture and ETL architecture and design for multiple projects.
- Extensively used Teradata BTEQ scripts and Macros, and coded for Teradata loader utilities such as FastLoad, MultiLoad and TPump for various ETL/ELT loads.
- Provided ETL architecture, design and requirements specifications for multiple projects.
- Designed the ETL code and load strategies to best utilize the ETL tool (Informatica) functionality and the Teradata loaders for ELT vs. ETL.
- Prepared a data platform for marketing data analytics using Omniture data in a Hadoop environment.
Technical Environment: Informatica PowerCenter, Teradata, Hadoop, HBase, Hive, Oracle, MS SQL Server, IDQ, SSIS, ERwin, Business Objects, Control-M, XML, SQL, Shell Scripting and Linux.
Confidential
ETL Architect/Lead
Responsibilities:
- Design and implement ETL load strategies, Informatica mappings, and database load scripts for Teradata (BTEQ, FastLoad, MultiLoad, FastExport, TPT (Teradata Parallel Transporter)) and Oracle (SQL, PL/SQL)
- Prepare ETL standards, best practices, reusable templates, migration procedures, and error-handling process design and implementation.
- Design & develop ETL processes and scripts to extract, transform & load data across various systems such as SAP BW, SAP R/3 IDocs, DB2, Oracle, Teradata, SQL Server, flat files, and XML.
- Created UNIX shell scripts for executing Teradata BTEQ scripts; designed, developed and tested Teradata BTEQ scripts for ETL development; used SQL and Teradata utilities to research and troubleshoot data issues.
Technical Environment: Teradata V2R6, Informatica PowerCenter 8.6, Oracle 10g/9.x, Business Objects, PowerConnect for SAP, SAP BW, TIDAL, Autosys, ERwin, XML, UNIX Shell Scripting