Lead Big Data And Cloud Architect Resume
SUMMARY:
- 27 years of senior executive experience spanning big data solution architecture, data science, business analytics, database management, data warehousing, data architecture, enterprise architecture, program management, IT operations & infrastructure, project management, portfolio management, contract management, information architecture, business process modeling, and customer relationship management.
- As Lead Enterprise Architect, provided big data architecture and cloud solutions to executive management utilizing Hortonworks HDP 2.5, Cloudera CDH 5.5, and MapR 4.01.
- As Cloud Architect, delivered end-to-end solutions using Amazon AWS and Microsoft Azure for compute, networking, storage, application, migration, integration, databases, analytics, and big data.
- Comprehensive functional understanding of financials, fraud detection, marketing, and network analytics, applied at global Fortune 500 organizations. Able to identify business process inefficiencies and provide corrective and improved measures.
- Led program management including project planning, prioritization, scheduling, work assignment, status reporting, and risk identification & mitigation. Improved team productivity by up to 15% by implementing project portfolio management to reduce multitasking.
- Led IT process integration initiatives by managing large IT control frameworks, IT process frameworks (e.g., ITIL), the ISO 17799/27001 information security standard, Sarbanes-Oxley compliance, and process engineering/process improvement. Managed Incident and Change Management processes.
- Solid experience with current high-end enterprise computing, including enterprise software, COTS, transactional, operational, and analytical systems, providing a sound basis for contrasting them with big data solutions.
- Improved operational capabilities and customer satisfaction by simultaneously managing multi-million dollar federal projects, while reducing cost by 30%.
- Delivered staffing and consulting services to federal and commercial clients in information technology, ensuring that risks are managed well, business continuity is maintained, and financial service management is undisturbed.
- Provided solutions to build metadata management for tracking and retaining business data, leveraging and facilitating reuse of existing table definitions, business rules, and data lineage.
BIG DATA ECO SYSTEM SKILLS:
Hive, Hive on Tez, Impala, HBase, Accumulo, Cassandra, MongoDB, Greenplum, Redshift, Amazon AWS, Microsoft Azure, Confidential Bluemix, SoftLayer, Netezza Striper (Confidential PureData System for Analytics), Cloudera CDH 5.8, Hortonworks HDP 2.5, Hortonworks HDF 1.2.0.1, MapR, Azure HDInsight, Hadoop, MapReduce 2, Hue, Zeppelin, Oozie, Phoenix, DataTorrent, Splunk, Hunk, Pepperdata, Platfora, Atigeo, Palantir, BlueData, Flume, Pig, Navigator, Atlas, Cloudera Manager, Ambari, MapR Control Center, Slider, Sqoop2, NiFi, Falcon, Kafka, Flink, SparkR, PySpark, Spark, Storm.
Azure/AWS: Informatica, DataStage, Ab Initio, virtual machines, virtual networks, network interfaces, storage accounts, CDN endpoints, Active Directory, Microsoft Azure, Azure Redis Cache and HDInsight clusters; ADM, BPM, EAI, SaaS, PaaS, IaaS; EC2, VPC, ELB, CloudFront, S3, Glacier, messaging with SQS, CloudWatch, RDS, VPN, IoT, EMR, Kinesis, Direct Connect, Route 53, Redshift, Data Pipeline
TECHNICAL SKILLS:
- SUN ULTRA ENTERPRISE 3000, AIX 4.x, 5.2
- SOLARIS-2.x, LAN
- UNIX SHELL PROGRAMMING
- SPARC SUN4 / SUN ULTRA 2, Novell NetWare
- SOLARIS TM 8 1/01, Windows
- C/C++ PROGRAMMING
- SUPER MINI COMPUTER, Netezza Striper, Twinfin
- UNIX System 5 - v 3.2
- Basic, COBOL, Java
- VAX-3400, VAX-4200/ VAX-6510/6520
- VAX/VMS (VER 6.2)
- VAX-COBOL/ VAX-FORTRAN, Pig
- Oracle 9i, 10g, 11g, DEC-DBMS, MS SQL Server 2000, 2005, 2008, SharePoint, Netezza, Oracle Grid.
- DBARTISAN 8.1, SQL Developer, Toad, OID
- Oracle Forms/reports, OBIEE, PL/SQL, OID, SSO, FMW, ANALYST 7.2 Tools, SAS, Plan view, E- Business R12
- Oracle RAC, Data Guard, ASM, RMAN, OEM Grid Control, ER-Studio
- BI Dash Board, BI Administration Tools, BI Answers
- Sybase 11.x,12.0, SQL Server, DB2
- Power Designer, Erwin
- MS-PROJECT, VISIO-5.0, VISIBLE, Quest
PROFESSIONAL EXPERIENCE:
Lead Big Data and Cloud Architect
Confidential
Responsibilities:
- Experience writing proposals to win cloud-based big data migration projects in the federal sector.
- Provided solutions to implement AWS-hosted Cloudera CDH clusters and Microsoft Azure HDInsight for several federal clients.
- Mentored team members to understand and support Cloudera CDH and Hortonworks HDP projects.
- Provided an efficient solution for implementing the YARN Capacity Scheduler in a multi-tenant HDP environment.
- Architected and reviewed design and development processes involving analysis of high-volume data from diverse big data sources using advanced mathematical, statistical, querying, and reporting methods.
- Applied predictive analytics techniques and inductive statistical analysis to predict outcomes and behavior using Spark and SAS Visual Analytics (see the sketch after the stack list below).
- Interacted with business partners to understand big data analytical requirements and assess feasibility based on large data and metadata sources; communicated insights and findings from analysis and experiments, and evaluated emerging technologies to create advanced machine learning models.
- Participated in the architecture, design, and implementation of large-scale big data and NoSQL solutions.
- Implemented hybrid, full-scale solutions covering data wrangling, ingestion, movement, storage, transformation, security, data management, and analysis using big data technologies.
- As a big data SME, identified risks and issues and provided technical subject-matter expertise in evaluating Cloudera, Hortonworks, and MapR, ultimately implementing a Cloudera CDH solution.
Big data stacks/databases: HDFS, MapReduce2, Pig, Hive, Tez, Impala, Storm, Oozie, HBase, Ranger, Navigator, Knox, Atlas, Kudu, NiFi, PySpark, SparkR, Scala, Sqoop, Kafka, Flume, YARN, Spark MLlib, MongoDB, Cassandra, SAS Visual Analytics, and SAS Studio
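A minimal PySpark sketch of the kind of predictive scoring described above, assuming a hypothetical feature table already registered in Hive (the table, columns, and label are illustrative, not a client's actual schema):

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    # Spark session with Hive support so Data Lake feature tables are queryable.
    spark = SparkSession.builder.appName("predictive-scoring").enableHiveSupport().getOrCreate()

    # Hypothetical feature table; column names are illustrative.
    df = spark.sql("SELECT tenure, monthly_spend, support_calls, churned FROM analytics.customer_features")

    # Assemble numeric columns into the single feature vector Spark ML expects.
    assembler = VectorAssembler(inputCols=["tenure", "monthly_spend", "support_calls"], outputCol="features")
    train, test = assembler.transform(df).randomSplit([0.8, 0.2], seed=42)

    # Fit a simple logistic regression model and check accuracy on the holdout split.
    model = LogisticRegression(labelCol="churned", featuresCol="features").fit(train)
    predictions = model.transform(test)
    print("holdout accuracy:", predictions.filter("prediction = churned").count() / float(predictions.count()))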
Confidential
Lead Big Data Architect
Responsibilities:
- Worked with the Confidential business development team to present big data solutions and industry best practices for Hadoop Sentry, big data governance, lineage, metadata management, big data backup & DR, big data ETL, Hadoop clusters, HiveServer2, Impala & Spark usage, big data multi-tenancy, SAS big data, and Spark machine learning.
- Led the Data Governance team to implement operational lineage using Cloudera Navigator and a metadata hub, and defined a strategy to address data concerns specific to confidentiality and PII.
- Recommended best practices for managing data quality processes for the Data Lake and developed a process for monitoring the data.
- Provided solution architecture to benchmark and fine-tune the Hadoop cluster for CPU, memory, I/O, network, and YARN configuration.
- Guided data architects in generating Hadoop data models for Hive database tables in the Data Lake (see the sketch below).
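A minimal sketch of the kind of partitioned Hive table modeling described above, expressed through Spark SQL; the database, table, and column names are illustrative assumptions:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("lake-modeling").enableHiveSupport().getOrCreate()

    # Illustrative partitioned, Parquet-backed Hive table for the Data Lake; names are hypothetical.
    spark.sql("CREATE DATABASE IF NOT EXISTS lake")
    spark.sql("""
        CREATE TABLE IF NOT EXISTS lake.transactions (
            txn_id     BIGINT,
            account_id BIGINT,
            amount     DECIMAL(18, 2),
            txn_ts     TIMESTAMP
        )
        PARTITIONED BY (txn_date DATE)
        STORED AS PARQUET
    """)

    # Partition pruning keeps scans limited to the requested dates.
    spark.sql("SELECT COUNT(*) FROM lake.transactions WHERE txn_date = DATE '2016-01-01'").show()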
Cloud Solution Architect
Confidential
Responsibilities:
- Provided solution architecture to deploy, manage, and operate scalable, highly available, fault-tolerant virtual machines on Azure and AWS.
- Migrated an on-premise big data cluster to Azure.
- Evaluated Azure Backup and Site Recovery (OMS).
- Migrated on-premise applications to Amazon AWS and implemented complex data integration.
- Provided solutions to select, deploy, and monitor the appropriate AWS services based on compute, data, or security requirements (see the sketch after the stack list below).
- Estimated Azure and AWS usage costs and identified operational cost-control mechanisms.
Big data stacks/Databases: HDFS, MapReduce2, Pig, Hive, Impala, Storm, Oozie, HBase, Navigator, Knox, Kudu, PySpark, SparkR, Scala, Sqoop, Kafka, Flume, YARN, Spark MLlib, MongoDB, Cassandra, AWS EC2, VPC, S3, Glacier, SQS, CloudWatch, EBS, RDS, Redshift, Azure VM & Azure Active Directory.
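A minimal boto3 sketch of the deploy-and-monitor pattern described above; the region, AMI ID, instance type, and alarm threshold are illustrative assumptions:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Launch a single instance; the AMI ID and instance type are placeholders.
    reservation = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
    )
    instance_id = reservation["Instances"][0]["InstanceId"]

    # Alarm on sustained high CPU so the instance can be right-sized or scaled out.
    cloudwatch.put_metric_alarm(
        AlarmName="high-cpu-" + instance_id,
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        Statistic="Average",
        Period=300,
        EvaluationPeriods=2,
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
    )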
Lead Enterprise Big Data Architect
Confidential
Responsibilities:
- Developed a whitepaper on best practices for implementing a multi-tenant cluster to support several projects within the Data Lake.
- Led the team in managing LDAP users and AD groups and setting up Kerberos for authentication and role-based authorization in the Hadoop cluster.
- Successfully implemented solutions and a strategy for data cleansing, data profiling, data lineage, de-normalization, and aggregation.
- Recommended a solution to maintain business, technical, and operational lineage and user-access monitoring using Navigator.
- Evaluated a cyber insider-threat project that provides defenders with real-time information to identify, prioritize, and respond to advanced security threats.
Environment: RHEL 6.x, Hortonworks, Cloudera, AD, LDAP, Ganglia, Nagios, Ambari, CDH, Cloudera Manager.
Director of Enterprise Big Data Architecture / Data Scientist
Confidential
Responsibilities:
- Managed, mentored, and supported big data architects, engineers, and data scientists in building advanced analytical models and data discovery and data mining products.
- Worked with the team to prioritize POC initiatives and delivered big data & advanced analytics demos to several clients.
- Provided solutions to implement cloud-based big data projects, including workload management, and supported the different phases of each project.
- Conducted weekly meetings with international development and operational users, providing the vision, roadmap, product development and operations support, and solutions to any risks that arose.
- Provided solutions to implement enterprise data warehouse operations and advanced predictive big data application development using Cloudera CDH 5.0 & Hortonworks HDP 2.0.
- Recommended a best-practice solution for migrating several customers from Netezza & Greenplum databases to the HBase & Hive platform, saving the company support and hardware costs.
- Used the Hadoop Data Lake to archive large volumes of data, exporting & importing large data volumes using Sqoop & Spark and thereby saving Netezza hardware cost (see the sketch after this list).
- Developed documentation detailing Hadoop cluster standards, database standards, procedures, policies, and service-level agreements between different clients.
- Experience in big data analytics, frequent itemset mining, and association rule mining.
- Drove various teams in the delivery of technology roadmap items, interfaced with the enterprise, and gave presentations on upcoming initiatives.
- Designed, investigated, and implemented public-facing websites on Amazon Web Services (AWS).
- Implemented major migration of Netezza from Twinfin to Striper (24 disks to 240 disks).
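A minimal PySpark sketch of the Netezza-to-Data-Lake offload described above; the JDBC URL, credentials, table names, and partition column are illustrative assumptions (the Netezza JDBC driver jar must be on the Spark classpath):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("netezza-offload").enableHiveSupport().getOrCreate()

    # Pull a source table from Netezza over JDBC; connection details are placeholders.
    source = (
        spark.read.format("jdbc")
        .option("url", "jdbc:netezza://nz-host:5480/SALESDB")
        .option("dbtable", "SALES_HISTORY")
        .option("user", "etl_user")
        .option("password", "********")
        .option("driver", "org.netezza.Driver")
        .load()
    )

    # Archive into a partitioned, Parquet-backed Hive table in the Data Lake.
    source.write.mode("append").format("parquet").partitionBy("SALE_YEAR").saveAsTable("lake.sales_history_archive")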
Enterprise Architect & Data Scientist
Confidential
Responsibilities:
- Accelerated the delivery of existing services and analytical products to market while meeting the company's risk and cost reduction objectives.
- Partnered with business experts and architects to convert key strategic objectives into a governable and actionable design and roadmap.
- Drafted current-state and future-vision enterprise architecture based on the IT roadmap and business strategies.
- Designed and carried out proofs of concept for vital enterprise capabilities, including building business cases with benefits, costs, and efficiency measures.
- Analyzed technology and business challenges, suggested solutions, and assessed costs.
- Implemented all phases of the TOGAF Architecture Development Method (ADM).
- Provided solutions to create advanced models using Confidential SPSS; experienced in implementing predictive models and shrinkage analytics (K-means).
- Experienced across a variety of statistical techniques spanning modeling, machine learning, and data mining, analyzing historical and current data to make predictions about future unknown events.
- Provided solutions to create churn, MS Clustering, linear regression, Naive Bayes, decision tree, and logistic regression models and scoring models (see the sketch after the tools list below).
Tools: HDFS, MapReduce, YARN, Spark, Pig, Shark, Impala, Flume, Hive, HBase, Accumulo, Storm, Sqoop, Sentry, Oozie, ZooKeeper, MLlib, Mahout, REST API, MicroStrategy, SVN, Git, Mega, EA, Confluence, SharePoint, Sales point, WebLogic, Tomcat, OAM, AWS, SSO, Amazon EC2, S3, RDS, Redshift, ELB, EBS, Microsoft Azure, Confidential SPSS & R
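A minimal sketch of the K-means segmentation work described above, shown here with Spark MLlib rather than SPSS; the feature table, column names, and cluster count are illustrative assumptions:

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.clustering import KMeans

    spark = SparkSession.builder.appName("customer-segmentation").enableHiveSupport().getOrCreate()

    # Hypothetical customer feature table; names are illustrative.
    df = spark.sql("SELECT customer_id, recency, frequency, monetary FROM analytics.rfm_features")

    # Build the feature vector and fit K-means with an assumed k of 5 segments.
    features = VectorAssembler(inputCols=["recency", "frequency", "monetary"], outputCol="features").transform(df)
    model = KMeans(k=5, seed=7, featuresCol="features", predictionCol="segment").fit(features)

    # Attach the segment label to each customer for downstream scoring.
    model.transform(features).select("customer_id", "segment").show(10)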
Program Manager/Architect
Confidential
Responsibilities:
- Evaluated a big data Hadoop solution for a federal client using the Cloudera CDH ecosystem.
- Provided solutions to manage and review Hadoop log files; ran Hadoop streaming jobs to process terabytes of text data, and loaded and transformed large sets of structured, semi-structured, and unstructured data in the development cluster (see the sketch after this list).
- Extensive experience translating business requirements into functional and technical specifications, with data models and data dictionaries.
- Strong understanding of data warehousing principles: fact tables, dimension tables, and star and snowflake schema modeling.
- Experience defining the federal client's SharePoint architecture, documenting and reviewing the architecture, and working on blueprints and roadmaps for information technology customers.
- Provided rough order of magnitude (ROM) estimates covering software, hardware, licensing, and labor costs. Planned application, infrastructure, and operations additions in large enterprise-wide IT and engineering data centers. Worked with engineers on designing and deploying network-element monitoring tools at field locations.
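A minimal Hadoop streaming mapper of the kind used for the text-processing jobs described above; the word-count logic is an illustrative assumption, and the matching reducer and the hadoop jar -mapper/-reducer invocation are omitted:

    #!/usr/bin/env python
    # Illustrative Hadoop streaming mapper: reads raw text lines on stdin and
    # emits tab-separated (word, 1) pairs for a downstream reducer to sum.
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            sys.stdout.write("%s\t1\n" % word.lower())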
Project Manager
Confidential
Responsibilities:
- Successfully implemented and transitioned to operations multiple deliverables supporting organizational security and program benefits.
- Responsible for running complex programs and projects, including critical tasks in design, development, reengineering, and production support services, with knowledge of CA Clarity Agile.
- Exhibited leadership and strategic planning in managing a broad range of project managers, resources, time, cost, and quality for critical projects, consistently achieving goals and milestones.
- Facilitated weekly and monthly program/project status and governance/PMR meetings, maintained a weekly dashboard, and integrated best-practice tools and processes for tracking action items, issues, risks, and decisions.
Specialties: Waterfall & Agile methodologies; ITIL & PMBOK standards; CMMI Level 3 application development standards compliant. Audits using the HP Fortify 360 tool.
Technologies: Oracle, Oracle Business Intelligence, Hadoop, HBase, Hive, MS SQL Server, SharePoint, Mega Architect tool, MarkLogic, Confidential Netezza 6.0, Oracle Fusion Middleware, Weblogic, Node Manager, Java J2EE, Struts, OAM, OID, Oracle SSO, Team Foundation Server, C#, .NET, ASP, Microsoft Visual Studio and MOSS 2007 & 2010 server, SQL 2008.
Vice President
Confidential, Houston, Texas
Responsibilities:
- Maintained senior management and customer expectations by prioritizing project contents and meeting project milestones.
- Oversaw relationships with other departments and third party vendors to track progress and coordinate multi-departmental releases.
- Coordinated the setup and managed the maintenance of a comprehensive disaster recovery system, conducted annual DR drills for multiple critical database applications, and managed database and application migration using Agile methodologies.
- Managed the development of various critical broker and trading application products.
- Established coding standards, standardized procedures for releases, production testing guidelines and enforced source code control.
- Developed robust Windows-based Executive Information Systems that provided key performance indicators to support critical decision-making.
Specialties: Waterfall & Agile methodologies, ITIL. Sun, J2EE, Weblogic, Linux, Oracle 11G RAC, ASM, Data Guard, Grid. Sybase ASE 15.7, Sybase Replication, MS SQL Server, DB2 UDB.
Sr. Database Project Manager, Database Management Lead
Confidential, Mclean, VA
Responsibilities:
- Extensive project implementation experience across the MF, SF, ITGC & LP domains; able to quickly adapt to different working environments and to understand complex technical concepts.
- Created and defined project inception phase-gate documents (e.g., Vision and Scope (SOW) document, project charter, high-level timelines and cost estimates) for project owner & stakeholder review and sign-off.
- Identified and remediated infrastructure issues in the ITGC infrastructure project within the database management domain and other application domains.
- Managed the information technology change control (ITGC) project: prepared the project charter, conducted the kick-off meeting, provided ITGC project artifacts such as the proposed architecture, business-rule spreadsheet, and data-profile questionnaire, prepared the project management plan, planned resources, managed cost and determined the budget, and sent weekly status dashboard reports.
- Managed and supported Sybase, Oracle, DB2 UDB, Sybase IQ & MS SQL Server database and data warehouse operations.
- Supported many application development projects in the Loan Prospector, Single Family, and Multifamily divisions; regularly took database backups and supported production database servers at Confidential.
- Implemented and provided production support for Oracle high-availability database solutions, including RAC, Data Guard, RMAN, and Grid Control, and migrated existing databases to a RAC environment.
Technologies: Oracle, MS SQL Server, Sybase ASE, Sybase Replication, DB2 UDB, Oracle E-Business R12, OBIEE, Hyperion, FTD, Waterfall & Agile methodologies; followed ITIL & PMBOK standards