Principal Architect Resume
SUMMARY
- He is a senior Big Data Technical Architect and Solution Designer with expertise in cloud environments.
- During his 20 years of IT experience, Confidential has successfully assumed technology leadership roles.
- He has a proven track record of designing, building, and leading technology applications.
- He has 4+ years of hands-on experience in the Big Data space.
- He has designed and implemented Big Data solutions using the Hadoop ecosystem and open source software, with a focus on governance and security.
- His passion is helping enterprises with cost-effective, high-performing big data solutions, both on-premise and in the cloud. He stays up to date on emerging trends in the big data space and effectively applies them for the benefit of his projects.
TECHNICAL SKILLS
- Primary Technologies: Hadoop, Apache Spark, MapReduce, Pig, Hive, Sqoop & other ecosystem tools, NoSQL (MongoDB, HBase), Cloudera, Hortonworks, EMR, ETL, Lambda architecture, real-time analytics solutions using Kafka & Spark Streaming, Amazon AWS, IoT data analytics
- Other Technologies: Java, J2EE, HTML, XML, Oracle, PL/SQL, Cloud Computing, Azure DL, Apache Storm, JIRA, Client-Server architectures, Web Technologies, Shell Scripts.
PROFESSIONAL EXPERIENCE
Confidential
Principal Architect
Responsibilities:
- Designed and implemented a big data lake on the cloud (IaaS and managed services) and achieved cost savings by using the right tools, services, and pricing models
- Developed reference architecture for hybrid model data lake and acquired buy-in from IT stakeholders
- Completed an as-is assessment to understand the existing analytical landscape, including tools & technologies.
- Evaluated business and analytical use cases and designed a platform to support use cases
- Analyzed data sources and designed the ingestion strategy, transformations, and data quality rules. Designed the complete data pipeline, from raw to curated data, required by data scientists and data analysts
- Built a real-time data pipeline utilizing Kafka; collected, aggregated, and moved large volumes of streaming data into HDFS using Flume, Kafka, and Spark Streaming.
- Collaborated with enterprise architects, network and security architects, data scientists, data source owners, end users and business sponsors during the process of designing and implementing data lake architecture
- Performed cost-benefit analysis for cloud vs. on-premise implementation and helped clients start their cloud journey.
- Implemented data governance policies in data lake and defined strategy to capture metadata for lineage
- Developed a data security strategy and implemented it up front using tools such as Kerberos, Ranger, Sentry, IAM, encryption, and masking.
- Led Hadoop cluster planning, sizing exercise and implementation of production ready Hadoop clusters
- Identified and resolved performance issues in Hadoop jobs. Defined best practices for performance optimization and educated development teams on them
- Designed and implemented a Data-as-a-Service model to serve data to downstream consumers
- Defined logical data models for data lake persistent layers for Hive and NoSQL
- Led RFP processes, developed proposals and completed effort estimations
- Served as the single point of contact for onsite and client teams for all technical/architectural discussions and decisions.
- Built a high-performance team through hiring, mentoring and knowledge sharing to deliver end-to-end Big Data solutions
Confidential, NJ
Program Manager
Responsibilities:
- Planned and set up the Big Data practice, hired & trained resources, and set up Hadoop infrastructure
- Led solution design, cluster setup, sizing, and effort estimation. Set up governance and access policies. Prepared project proposals and presentations.
- Worked closely with data scientists to understand algorithms and implement them in production
- Designed data transformation and aggregation solutions using MapReduce, Pig, and Hive, including loading and transforming large sets of structured and semi-structured data
Confidential, NJ
Senior Project Manager
Responsibilities:
- Initiated and led big data initiative, hired and trained team members and supported pre-sales
- Successfully led and completed several projects, including setting up Hadoop clusters for big data projects and integrating Force.com with the existing IT solution, which helped the customer meet business demands
- Defined and identified the key technologies to handle very large volumes of unstructured data, such as web logs, text data, and application logs. Led projects for transformation and aggregation of the acquired data.
Confidential
Program Manager
Responsibilities:
- Supported the full spectrum of duties for IT technology services and participated significantly in all process steps, from pre-sales, tendering, and transition to offshore through development, implementation, training, and final client sign-off. Owned and efficiently managed organizational initiatives such as hiring, QA, and compliance.
- Successfully managed a large, critical program consisting of multiple development, testing, and support projects for multinationals such as BT and AT&T. Owned delivery, including P&L of $5M, 100+ resources, deliverables, quality, customer communications, and service compliance, and achieved cost savings of 15% YoY
- Conducted and led design, code, and test reviews, which helped achieve a 27% defect reduction.
- Delivered enterprise level web applications (Java/J2EE/Oracle) & scaled up architecture using Hadoop.
