
Technology Evangelist And Consultant Resume

SUMMARY

  • Passionate about Information Technology, with extensive experience building scalable, data-driven enterprise applications leveraging a vast array of technologies, including Social, Mobility, Analytics, and Cloud; actively pursuing opportunities in emerging technologies
  • 14 years of programming, managerial, and architecture experience using a vast array of technologies and frameworks across various domains, in the following roles:
  • Technology Evangelist/Big Data and Cloud Consultant
  • Enterprise Architect/Practice Consultant
  • Sr. Database Consultant/Data Solutions Architect/Enterprise Architect/Technical Manager
  • Sr. Software Engineer/Database Lead
  • Professional mentor in a vast array of technologies. Working on a free, online, self-paced, video-based IT university, www. Confidential .com, to guide and mentor people in the latest technologies
  • SMAC (Social, Mobility, Analytics, Cloud) and Virtualization is the current focus area
  • Master of Computer Applications (MCA) from Nagarjuna University, India
  • Expert in the Big Data ecosystem - HDFS, HBase, MapReduce, Pig, Hive, NoSQL, MongoDB
  • Cloudera Certified Developer as well as Administrator for Apache Hadoop
  • Certified MongoDB Developer (Python and Java) as well as MongoDB DBA
  • Oracle Certified Associate - PL/SQL Developer, DBA, GoldenGate
  • Experience in building and managing small and medium-sized talent pools of up to 20 people
  • Expert in Oracle VM technology - set up Oracle VMs on Sun servers and Oracle ZFS
  • Extensive knowledge of Amazon Web Services (AWS) - S3, EC2, EMR, Redshift, DynamoDB
  • Well versed in cloud and virtualization technologies - AWS, Confidential SoftLayer, Oracle Cloud, OpenStack, Cloud Foundry, etc.
  • Knowledge of Oracle Fusion Middleware architecture - ODI, GoldenGate, OBIEE, SOA Suite, etc.
  • Certified training in data warehousing tools such as Informatica, Cognos, and GoldenGate
  • Expert in SQL, PL/SQL, Java/J2EE, Oracle Architecture and Performance tuning
  • Mentor in SQL, PL/SQL, and performance tuning, as well as GoldenGate
  • Data integration with Oracle Directory LDAP Server and the Oracle Coherence in-memory data grid
  • Experienced with Java/J2EE containers - Oracle WebLogic, Confidential WebSphere
  • Well versed in data modeling in both OLTP and OLAP environments
  • Well versed in data warehouse concepts and architecture
  • Strong knowledge of and experience in Perl and Unix shell scripting
  • Involved in the design and development of database-driven systems ranging from 50 GB to several PB of data

TECHNICAL SKILLS

Big Data Ecosystem: Hadoop, Spark, and the surrounding ecosystem

Cloud and Infrastructure: AWS, Microsoft Azure, Confidential Bluemix, Cloud Foundry

Database: Oracle, SQL Server, Informix, Sybase, HBase, MongoDB

DW/DI Tools: GoldenGate, Informatica, Cognos

Application Development: Java, J2EE, Spring, MEAN stack, etc.

PROFESSIONAL EXPERIENCE

Confidential

Technology Evangelist and Consultant

Responsibilities:

  • Consulting and advisory services covering both infrastructure and delivery in emerging technologies such as Big Data and Cloud
  • Developed subscriber-based Big Data labs accessible from anywhere - labs. Confidential .com
  • Developing high-quality content on emerging technologies such as Big Data, Oracle, GoldenGate, and Cloud for my channel www.YouTube.com/itversityin to provide role-based training for professionals at all experience levels
  • Delivered training in almost all Big Data technologies, such as Hadoop, MapReduce, Spark, Hive, Pig, Sqoop, Oozie, HBase, MongoDB, Kafka, and Flume
  • Implemented proofs of concept to onboard new technologies at several clients
  • Delivered training on almost all modules of Core Spark (transformations and actions), Spark SQL, DataFrames, and Spark Streaming, using both Scala and Python
  • Developing a website, www. Confidential .com, as a case study for SMAC. It will be a responsive website using the latest JS frameworks and Twitter Bootstrap for the UI, Node.js for the web server, MongoDB for the database, and Hadoop for analytics. www. Confidential .com is a one-stop online IT education platform integrated with my YouTube channel “ Confidential ”. It will eventually have free content covering a vast array of technologies, including SMAC.
  • Developing a data migration product for cross-platform migration of large, multi-terabyte Oracle databases with minimal downtime.
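The Core Spark training mentioned above hinges on the distinction between lazy transformations and eager actions. As a minimal sketch of that idea (plain Python generators standing in for RDDs; no Spark APIs are used here):

```python
# Sketch of Spark's lazy-transformation / eager-action model in plain Python.
# Generator pipelines stand in for RDDs: nothing runs until an "action".

def transform_map(data, fn):
    """Lazy: returns a generator, computes nothing yet (like rdd.map)."""
    return (fn(x) for x in data)

def transform_filter(data, pred):
    """Lazy: like rdd.filter."""
    return (x for x in data if pred(x))

def action_collect(data):
    """Eager: forces evaluation of the whole pipeline (like rdd.collect)."""
    return list(data)

numbers = range(1, 6)
pipeline = transform_filter(transform_map(numbers, lambda x: x * x),
                            lambda x: x % 2 == 1)
# No squaring or filtering has happened yet; the action triggers it all.
result = action_collect(pipeline)
print(result)  # [1, 9, 25]
```

In Spark proper, this laziness is what lets the engine fuse a chain of transformations into a single pass over the data before any action runs.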

Confidential

Enterprise Architect/Practice Consultant

Responsibilities:

  • Analyze the organization's current big data capabilities and define a strategy to build a reliable big data and analytics practice for the company.
  • Implemented several big data POCs on Azure and AWS, leveraging the pay-as-you-go model.
  • Successfully executed a POC to migrate the client's analytics product away from EMR and make the ETL process cloud-agnostic.
  • Working on a POC to migrate the reporting database of the client's analytics product from AWS Redshift to Apache Spark
  • Evaluate HAWQ to assess the feasibility of migrating the client from Redshift to either HAWQ or Spark in the Big Data ecosystem.

Confidential

Enterprise Architect

Responsibilities:

  • Confidential ExperienceOne provides customer engagement solutions to clients, including strategic initiatives such as Coremetrics, Unica, Tealeaf, and Silverpop.
  • Implemented POCs using different distributions of Hadoop (Cloudera, Hortonworks, Pivotal HD, MapR etc)
  • Configure security as well as high availability for Hadoop clusters.
  • Lead teams of up to 10 on different projects using technologies such as Oracle, Hadoop, HBase, Java, and Puppet.
  • Redesign the disparate platforms of all Confidential ExperienceOne Digital Analytics products onto the Big Data ecosystem
  • Design and implement near real time data integration between MQ and Hadoop.
  • Integration of Coremetrics and Unica to provide additional features to the clients.
  • Develop POCs using Hadoop and HBase to replace conventional platforms such as Oracle, Aster etc.
  • Develop an ETL framework using Hadoop ecosystem components such as Hive and MapReduce to ingest data into Hadoop
  • Integrate dimensional data using HBase in real time
  • Performance tuning for operations in Oracle and Teradata Aster
  • Set up multiple Hadoop clusters on Confidential Softlayer (cloud) and integrate them with Puppet to automate the process
  • Set up a 6-node MongoDB cluster and evaluate its feasibility for storing dimension data
  • Migrate data from Oracle to MongoDB and validate reports using bulk inserts
  • Develop data validation utility between Oracle, Aster and MongoDB
  • Design and develop a data integration and validation tool using PL/SQL to upgrade existing databases from 10g to 11g, which outperforms Oracle expdp and impdp in cross-platform migrations.
  • Mentor a vast array of audiences in the Big Data stack (Hadoop ecosystem, HBase, MongoDB, etc.).
  • Implement complex MapReduce applications using all Hadoop MapReduce features, such as file formats, key and value types, partitioners, combiners, and custom comparators.
  • Define framework and best practices for projects using different technologies. Perform periodic code reviews to ensure code is following the defined best practices.
  • Redesign data models from Oracle to effectively fit into the Hadoop ecosystem (Hive/HBase).
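The MapReduce features named in the bullets above (mappers, combiners, partitioners, reducers) follow a standard flow that can be sketched in plain Python. This word-count sketch is illustrative only and uses no Hadoop APIs:

```python
# Plain-Python sketch of the Hadoop MapReduce flow:
# mapper -> combiner -> shuffle -> reducer. No Hadoop required.
from collections import defaultdict

def mapper(line):
    # Emits (word, 1) for every word, like a Hadoop Mapper.
    for word in line.split():
        yield word, 1

def combiner(pairs):
    # Pre-aggregates locally on each map task to cut shuffle volume.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return counts.items()

def partitioner(key, num_reducers):
    # Routes a key to a reducer; Hadoop's default is hash partitioning.
    # (Shown for completeness; this sketch runs a single reducer.)
    return hash(key) % num_reducers

def run_job(lines):
    shuffled = defaultdict(list)          # shuffle/sort phase
    for line in lines:                    # one "map task" per line
        for key, value in combiner(mapper(line)):
            shuffled[key].append(value)
    # Reduce phase: sum the partial counts per key.
    return {key: sum(values) for key, values in shuffled.items()}

print(run_job(["big data big", "data pipelines"]))
# {'big': 2, 'data': 2, 'pipelines': 1}
```

The combiner is the piece worth noting: it applies the reduce logic early, on the map side, so far fewer (key, value) pairs cross the network during the shuffle.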

Confidential

Big Data Solutions Architect/Sr. Manager

Responsibilities:

  • Cloudera Certified Developer for Apache Hadoop
  • Cloudera Certified Administrator for Apache Hadoop
  • Cloudera Certified Specialist in Apache HBase
  • Compare and contrast different distributions of Hadoop, such as Cloudera CDH, Hortonworks HDP, and Pivotal HD
  • Expertise in building numerous Hadoop clusters using different flavors.
  • Certified in MongoDB as a developer (using Python and Java) as well as a DBA
  • Excellent knowledge of HBase architecture and data modeling for NoSQL databases
  • Excellent knowledge of the Hadoop Distributed File System, Hadoop MapReduce, HiveQL, and Pig Latin
  • Proficient in migrating data into HDFS using Flume and Sqoop
  • Proficient in REST, JSON, Avro, Hadoop Streaming and Pipes etc.
  • Expertise in Big Data visualization tools such as Datameer, Tableau etc.
  • Hands-on expertise in setting up Hadoop clusters on Amazon EC2, along with S3, EMR, Redshift, etc.
  • Migrate data warehouses from Oracle and SQL Server to Redshift
  • Design and implement a Data Warehouse in Hadoop, replacing the DB2 ODS, Teradata DW, and DataStage ETL by leveraging the Hadoop ecosystem.
  • Keep up with new vendors and tools emerging in the Big Data ecosystem
  • Mentor team members in setting up the Big Data ecosystem in development/POC environments on their laptops using Linux on VMware
  • Actively participated in the interview process to screen/hire Sr. Associates, Architects, Consultants, and Managers in the Big Data ecosystem
  • Built a Cloudera Distribution of Hadoop cluster on 26 nodes, with 624 cores in total, 150 TB of data storage, 416 mappers, 208 reducers, 2 racks, and 1.25 TB of RAM.
  • Set up a 24-node MongoDB cluster - 3 config servers, 3 app servers, and 18 shard nodes with 6 primaries and 12 secondaries.
  • Led the effort to migrate data from Oracle to MongoDB using Java
  • Implemented read preferences and explored different levels of Write Concern
  • Extensively used the MongoDB aggregation framework to generate reports from the data
  • Actively involved in the ETL process to get data into and out of Hadoop using Hive, Sqoop, Oozie, and Java MapReduce.
  • Performance tuning to effectively write data into and read data from Hadoop using Hive, Sqoop, and Java MapReduce.
  • Design and implement highly efficient MapReduce programs, customizing the Mapper, Reducer, Combiner, Comparator, Partitioner, etc.
  • Develop statistical methods using R and machine learning algorithms (supervised and unsupervised) using Java MapReduce
  • Set up ETL between Hive and SQL Server using a combination of HiveQL, Sqoop, and shell scripting
  • Design and implement Hive tables (partitioned and non-partitioned) for the Pharmacy Data Warehouse
  • Migrate historical data to the Data Warehouse and plan for incremental day-to-day loads
  • Define Oozie workflows to kick off Hive and Sqoop jobs in an orderly fashion
  • Solutions Architect building an operational data store to integrate store data with the eCommerce platform.
  • Actively involved in building a Cassandra data store, integrating data in real time from Informix to Cassandra using Cassandra, web services, Tomcat, etc.
  • Define keyspaces and column families in Cassandra and provide interfaces to write to the Cassandra data store
  • Develop an interface for bulk loads into the Cassandra cluster, primarily for the initial load
  • Mentor and perform code reviews to ensure the quality of the code delivered for production deployments.
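The MongoDB aggregation framework mentioned above builds reports from a pipeline of stages. A hedged sketch of such a report (collection and field names like orders, region, amount, and status are hypothetical, not taken from the actual project; with pymongo the pipeline would be passed to db.orders.aggregate):

```python
# Aggregation pipeline document: total completed-order amount per region.
# Field names are illustrative only.
pipeline = [
    {"$match": {"status": "complete"}},
    {"$group": {"_id": "$region", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
]

def aggregate_locally(docs):
    # Plain-Python equivalent of the $match and $group stages above,
    # so the sketch runs without a MongoDB server.
    totals = {}
    for doc in docs:
        if doc["status"] == "complete":  # $match
            # $group with $sum, keyed by region
            totals[doc["region"]] = totals.get(doc["region"], 0) + doc["amount"]
    return totals

sample = [
    {"region": "east", "amount": 10, "status": "complete"},
    {"region": "east", "amount": 5, "status": "complete"},
    {"region": "west", "amount": 7, "status": "complete"},
    {"region": "west", "amount": 99, "status": "pending"},
]
print(aggregate_locally(sample))  # {'east': 15, 'west': 7}
```

Running the pipeline server-side is the point of the framework: the filtering and grouping happen inside MongoDB, so only the small grouped result crosses the wire.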

Confidential

Sr. Database Consultant

Responsibilities:

  • Extensively worked on the design and development of multi-terabyte, data-driven applications for many major clients, using a vast array of technologies.
  • Successfully implemented several POCs in emerging technologies and created road maps for clients to embrace them to solve their business problems
