Senior Solutions Architect Resume
- Solution-oriented engineer with excellent analytical skills. Experienced in Big Data solutions, Analytics, and NoSQL platforms. Over 20 years of experience in Data Warehousing, Business Intelligence, and ELT/ETL. Possesses strong leadership and program management skills.
- Expert in translating business requirements to optimal solutions.
- Exposed to Big Data, MPP technologies, Cloud, Web and various other technologies.
Database: Greenplum MPP, Impala, Hive, Couchbase, Oracle, Netezza, and various other databases. Expert in SQL, PL/SQL, PL/pgSQL, performance tuning, etc.
Big Data: HDFS, Impala, Hive, custom Greenplum backup and restore to/from HDFS, Hadoop ecosystem familiarity.
ELT/ETL: Informatica as well as custom data load jobs using SQL, PL/SQL, PL/pgSQL, Perl, and Unix shell scripts.
BI/Reporting tools: OBIEE, Business Objects, Birst, Oracle Application Express (APEX). Multiple programming and scripting languages, including sh, Perl, Java, and Python.
Training: Oracle Database 10g: SQL and PL/SQL New Features Ed 1, by Oracle University.
Web related technologies:
Preferred Locations: San Francisco Bay Area / Nashville, TN / Remote / Travel / Around Bangalore, India
Work Permit: US Citizen, Overseas Citizen of India (OCI)
Senior Solutions Architect
- Level 3 support for Greenplum MPP; assisted the Infrastructure support team (Level 1) and the Applications support team (Level 2) when issues escalated.
- Supported SQL on Cloudera Hadoop (Impala and Hive), specifically for performance-related issues, using both command-line and GUI interfaces for Impala, Hive, and other tools.
- Supported LDAP authentication to Greenplum. Used Kerberos credentials with kinit for Hadoop products.
- Custom script-based Greenplum database backup/restore directly to/from Hadoop, as the DB server often did not have space for backups.
- Designed, developed, and supported ETL for a real estate application; jobs were scheduled via Control-M.
- Supported custom user onboarding to the Greenplum application.
- Supported the audit team in establishing guidelines and implementing them.
- Supported different lines of business with their Greenplum design, development, usage, and best practices.
- DBA tasks and scripts to back up and restore to a DR server over the WAN.
- ETL/SQL code generator based on parameters stored in a table.
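The parameter-driven code generator mentioned above can be sketched as follows. This is a minimal illustration of the pattern, not the actual project code: each row of a control table (represented here as a plain dict) describes one load, and the generator emits the corresponding INSERT ... SELECT statement. All table and column names are hypothetical.

```python
# Minimal sketch of a parameter-driven SQL generator. A control table
# normally supplies these parameters; a dict stands in for one row here.
def generate_load_sql(param: dict) -> str:
    cols = ", ".join(param["columns"])
    sql = (
        f"INSERT INTO {param['target_table']} ({cols})\n"
        f"SELECT {cols}\n"
        f"FROM {param['source_table']}"
    )
    if param.get("filter"):
        # Optional WHERE clause, e.g. to restrict to the current load date.
        sql += f"\nWHERE {param['filter']}"
    return sql + ";"

# Illustrative parameter row (names are invented for the example).
params = {
    "source_table": "stg.orders",
    "target_table": "dw.fact_orders",
    "columns": ["order_id", "customer_id", "order_amt"],
    "filter": "load_date = CURRENT_DATE",
}
print(generate_load_sql(params))
```

Keeping the load definitions in a table rather than in code means new feeds can be added by inserting a row, with no script changes.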
Senior Solutions Architect
- Consulting in the field of Big Data, Analytics, NoSQL, Data Warehousing, MPP, Data Integration/ETL/ELT, Business Intelligence, and related technologies.
- Installed software-only Greenplum into a cluster of virtual machines running CentOS 5.7 in different configurations: single node, 2 segment nodes, and 4 segment nodes.
- Ensured readiness at the Linux level for the Greenplum install.
- Installed Greenplum performance monitor, connectivity (ODBC and JDBC), loader (gpload, gpfdist), and client (psql) packages.
- Wrote shell and SQL scripts to run data loads and queries to capture performance metrics.
- Migrated tables and data from Oracle 10.2 to Greenplum.
- Tested OBIEE connectivity and the Informatica 9.x connector.
- Used Birst for data modeling, ETL, dashboards, and other reporting. Birst is a SaaS BI solution delivered on the cloud.
- Created new dashboards and moved all KPI reports from Excel to Birst.
- Reconciled differences between Accounting numbers and BI numbers.
- Production support/troubleshooting of Birst issues.
- Catered to ad hoc requests from users.
- Reverse-engineered the data model from SQL code, as keys were not present in the DB (Greenplum does not enforce keys, so the keys had never been created), and engineered it into WhereScape. DDLs were generated, modified, and applied to a MySQL DB; MySQL Workbench was used to generate the ER models.
- Project involved migrating DB from DB2 to Greenplum.
- The data warehouse on DB2 z/OS was to be replicated on a Greenplum appliance.
- Involved Perl scripting to convert EBCDIC fixed-width data to ASCII delimited, including unpacking packed decimal fields.
- Implemented Change Data Capture (CDC): INSERT, UPDATE, and DELETE records were captured by CA Log Analyzer on DB2 for selected large tables.
- EBCDIC-to-ASCII conversion was performed, and the latest version of the record for a given key was applied.
- Wrote scripts to generate scripts using arguments and metadata or DDL in files.
- Developed shell scripts and PL/pgSQL functions. The PL/pgSQL functions took a schema name and a table name as arguments and generated SQLs from the catalog metadata of columns and data types.
- External tables and gpfdist were used for fast parallel loading.
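The EBCDIC conversion work above was done in Perl; as a rough illustration of the packed decimal (COMP-3) unpacking step, a Python sketch might look like the following. The encoding details (two digits per byte, sign in the low nibble of the last byte) are standard IBM packed decimal; the function itself is illustrative, not the project's actual code.

```python
# Sketch of unpacking an IBM packed-decimal (COMP-3) field, the kind of
# step needed when converting EBCDIC fixed-width extracts to ASCII.
def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode packed decimal: two digits per byte, with the final byte
    holding one digit plus the sign nibble (0xD means negative)."""
    digits = []
    for b in raw[:-1]:
        digits.append((b >> 4) & 0x0F)
        digits.append(b & 0x0F)
    last = raw[-1]
    digits.append((last >> 4) & 0x0F)
    sign_nibble = last & 0x0F
    value = int("".join(str(d) for d in digits))
    if sign_nibble == 0x0D:
        value = -value
    return value / (10 ** scale)

# 0x12 0x34 0x5C -> digits 1,2,3,4,5 with positive sign nibble (0xC).
assert unpack_comp3(b"\x12\x34\x5C", scale=2) == 123.45
# Plain EBCDIC character data converts with a standard codec, e.g. cp037.
assert bytes([0xC1, 0xC2]).decode("cp037") == "AB"
```

For the character (non-packed) fields, a standard EBCDIC codec such as cp037 handles the translation, as the last line shows.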
Confidential, Concord, CA
- Transitioned to the nascent Canada track in Aug 2010 as the Data Warehouse (DWH) & Business Intelligence (BI) track technical lead (Architect) while requirements were being gathered and the project was all in red. By Dec 2010, 45 OBIEE reports of varying complexity accessing an Oracle DB were delivered to QA, with development completing on time. Informatica was used to integrate data from various sources into the DWH in Oracle.
- GP Command Center integration.
- Guided the system integration partners in design and development and performed code reviews. Implementation and production support were provided on an ongoing basis after the project went live in May 2011. Production OBIEE servers were maintained by a central team hosting merged RPDs and dashboards of multiple tenants; had to integrate with their processes and timelines to move what was developed and QA'ed locally to the central QA and, subsequently, production environment.
- Worked with several teams from Engineering tracks, such as the Yantra order management/fulfillment system, the ATG eStore order placement system, the ATG CSC customer management system, and the Fatwire content management system, with external data integrated via a combination of Informatica and TIBCO for source data.
- Worked with the Product team for requirements, multiple project teams for meeting plan dates and inter-team coordination, and the Systems, Security, and Networking teams for infrastructure-related issues.
- 5 years of experience meeting the data warehousing and BI requirements of the Confidential US track.
- Led the Marketplace initiative from the DWH & BI track, apart from several other key projects.
- Trained, guided, and monitored the work of the system integrator partner.
- Extensive direct interaction with business users to ascertain requirements and suggest options over and above those requested. Also interacted with product managers (business analysts).
- The DWH was originally on Oracle with custom PL/SQL and shell scripts for ETL. Evaluated the options of continuing with Oracle versus migrating to Teradata or Netezza; Netezza was chosen as the DWH DB, and Informatica was brought in as the ETL tool.