
Big Data - Hadoop/Data Integration Architect Resume


Atlanta, GA

SUMMARY

  • 15 years of IT experience in solution design, development, and implementation of various large, data-intensive projects.
  • 6 years of development experience with big data tools such as Hadoop, Hive, HBase, Pig, Sqoop, NiFi, Spark, Kafka, and Oozie.
  • 10+ years of solution design experience in enterprise data warehousing.
  • 2 years of experience in machine learning, deep learning, and artificial intelligence.
  • 3 years of experience in SAS and R analytical modeling.

TECHNICAL SKILLS

Programming languages and mathematical packages: MATLAB, Python, Scala, R, PL/SQL, UNIX Shell & Perl scripting, Java

Big Data/HDFS: Hadoop, Hive, HBase, Pig, Sqoop, Kafka, Spark, NiFi, Phoenix, Oozie, Ambari

DB/ETL/BI: Oracle Exadata/12c, 11g, IBM DB2, Informatica, OBIEE, Tableau

Other Tools: Autosys, RapidSQL, WinSCP, Avaloq, Axiom, TLM, SQL*Loader, ClearCase, Subversion

PROFESSIONAL EXPERIENCE

Big Data - Hadoop/Data Integration Architect

Confidential, Atlanta, GA

Responsibilities:

  • Customer Impact Table (CIT) - real-time outage and impact analysis (see the streaming sketch after this list).
  • Field Network Technicians Activity API - real-time reporting and analytics.
  • Lead management for marketing and sales - digital interaction and several other data source integrations.
  • Data ingestion from Amazon S3 to HDFS for various reporting and lead creation.
  • Gigablast and DOCSIS 3.1 - equipment polling and performance data management.
  • Data ingestion from scheduled polling sources WEBNMS, EDGEHEALTH, ICOMS, Granite, and GNIS.
  • Sales chat data and opportunity analysis - LivePerson chat log analysis.
  • Video and clickstream analytics for real-time recommendation engine management.
  • HBase implementation as a data service layer over HDFS - CCM middleware integration.
  • Enterprise core data sets - data ingestion for Minerva (HAWQ DB); Sqoop framework for multi-DB ingestion.
  • Fraud detection - data science project.
  • Subscriber Journey Mart.
  • BCP (business continuity planning) - event-driven data processing for real-time service-impacting events.
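
As a rough illustration of the real-time impact analytics pattern behind the CIT work above, the following is a minimal PySpark Structured Streaming sketch. The Kafka broker, topic name (outage-events), event schema, checkpoint path, and console sink are illustrative assumptions, not the project's actual configuration.

```python
# Hypothetical sketch: real-time outage impact counts from a Kafka feed.
# Requires the spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("cit-outage-impact").getOrCreate()

# Assumed JSON payload carried on the topic.
event_schema = StructType([
    StructField("node_id", StringType()),
    StructField("market", StringType()),
    StructField("event_time", TimestampType()),
])

# Read raw outage events from Kafka and parse the JSON payload.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # assumption
    .option("subscribe", "outage-events")                # assumption
    .load()
    .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Count impacted nodes per market over 5-minute windows.
impact = (
    events
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "market")
    .agg(F.approx_count_distinct("node_id").alias("impacted_nodes"))
)

query = (
    impact.writeStream.outputMode("update")
    .format("console")                                    # stand-in for the real sink (HBase/Hive)
    .option("checkpointLocation", "/tmp/cit-checkpoint")  # assumption
    .start()
)
query.awaitTermination()
```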

Big Data - Hadoop/Data Integration Architect

Confidential

Responsibilities:

  • Own solution architecture for EDS systems, provide high-level designs, and work with Enterprise Architects (EAs).
  • Lead the Data Integration & Platforms function, with leadership responsibility for envisioning, planning, building, and deploying enterprise data and analytics capabilities.
  • Work as the lead architect for various big data and traditional data integration projects.
  • Design high-level data flows and choose the right big data tool, or combination of tools, for each project.
  • Design, develop, and modify Hadoop, Sqoop, Spark, Hive, and HBase code as well as Informatica mappings and workflows (see the ingestion sketch after the environment line below).
  • Conduct extensive research on big data tool selection based on each use case.
  • Build proofs of concept for new initiatives and implement discovery-zone activities.
  • Think big when building any solution - scalable, reusable, and operations-friendly products.
  • Provide estimates to the end-to-end project manager and deliver the HLD and LLD for the project.
  • Lead and manage vendor resources.
  • Conduct design and code review meetings.
  • Work with operations and infrastructure teams to ensure code can run without manual intervention.
  • Act as the data SME and provide end-to-end technical guidance for data flows into HDFS (data lake), ODS, and EDW.
  • Provide technical guidance to project teams in the development of project prototypes and scoping efforts.
  • Assist in defining data quality metrics, standards, and guidelines for using data.
  • Draft and communicate data integration strategies and visions.
  • Assure that sensitive data, regardless of format, is protected at all times by using only approved equipment, networks, and other controls; work with the Enterprise Security team to vet processes for data in transit and at rest.
  • Ensure that standard project methodology is followed and that policies, procedures, and metrics are in place for maintaining and improving data quality and for the creation, capture, and maintenance of metadata.
  • Ensure that all strategic data is modeled, named, and defined consistently.
  • Support and share knowledge with other architects.
  • Communicate new and changed business requirements, issues, and problems to individuals who may be impacted.
  • Communicate concerns, issues, and problems with data to the individuals who can influence change.
  • Evaluate and design logical and physical databases; define logical views and physical data structures.
  • Validate test plans and test scenarios; verify test results; help QA teams understand business and technical requirements.

Environment: Hortonworks HDP, HDFS, HDF/NiFi, Hadoop, Informatica 9.1, Oracle Exadata, SAS, and Splunk.
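
The ingestion work described above generally moves relational data into Hive/HDFS; the following is a minimal sketch of that pattern using Spark's JDBC reader on HDP. The JDBC URL, credentials, source table, partition column, and target Hive table are illustrative assumptions. A Sqoop import with a split-by column achieves the same parallel pull; the Spark version is shown only to keep all sketches in one language.

```python
# Hypothetical sketch of relational-to-Hive ingestion on HDP.
# Requires the Oracle JDBC driver jar on the Spark classpath.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("rdbms-to-hive-ingest")
    .enableHiveSupport()
    .getOrCreate()
)

# Pull a source table in parallel over JDBC, split on a numeric key.
src = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")  # assumption
    .option("dbtable", "SALES.ORDERS")                      # assumption
    .option("user", "etl_user")                             # assumption
    .option("password", "****")
    .option("partitionColumn", "ORDER_ID")                  # assumption
    .option("lowerBound", "1")
    .option("upperBound", "10000000")
    .option("numPartitions", "8")
    .load()
)

# Land the data as an ORC-backed Hive table partitioned by load date.
(
    src.write.mode("overwrite")
    .format("orc")
    .partitionBy("LOAD_DT")          # assumes the source carries a LOAD_DT column
    .saveAsTable("staging.orders")   # assumption
)
```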

Confidential, Stamford, CT

Data Architect / ETL Specialist / Sr. ETL-BI Consultant

Responsibilities:

  • Work closely with business users and business analysts to gather functional and technical requirements.
  • Design and implement ETL solutions for complex functional and technical requirements using Informatica PowerCenter 8.6.
  • Contribute to logical data model design and work closely with the architecture team on physical implementation.
  • Work closely on the relational and dimensional modeling aspects of the DW design and participate in brainstorming sessions.
  • Implement error handling, exception handling, and logging mechanisms, following coding standards and best practices.
  • Implement data archiving and data reconciliation using Informatica, SQL, and Unix components (see the reconciliation sketch after the environment line below).
  • Create shell scripts for various requirements and define new processes for the application.
  • Write complex stored procedures and triggers and optimize them for maximum performance.
  • Utilize Oracle utilities such as Import/Export and SQL*Loader for large-scale data loads.
  • Assign code reviews and participate in peer reviews of several ETL components.
  • Develop data migration strategies and processes for production deployment.
  • Handle production issues and release issues.
  • Support the production support and operations teams in their day-to-day needs.
  • Problem and incident management - respond to production incidents and track them to resolution and closure.
  • Coordinate cross-team requirements between upstream and downstream teams, application architects, and enterprise teams.
  • Informatica administration and management.

Environment: Informatica 8.6, Oracle 11g, DAC, Autosys, Siebel
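
As a simple illustration of the reconciliation step mentioned above, here is a minimal Python sketch that compares row counts between a source staging table and its warehouse target. On the project this was implemented with Informatica/SQL/Unix components; the connection details and table names below are illustrative assumptions.

```python
# Hypothetical reconciliation sketch: compare source vs. target row counts
# and exit non-zero on drift so the scheduler (e.g. Autosys) can alert.
import sys
import cx_Oracle


def row_count(conn, table):
    """Return the row count for a table."""
    cur = conn.cursor()
    try:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]
    finally:
        cur.close()


def main():
    src = cx_Oracle.connect("etl_user", "****", "srcdb:1521/SRC")  # assumption
    tgt = cx_Oracle.connect("etl_user", "****", "dwhdb:1521/DWH")  # assumption
    try:
        src_cnt = row_count(src, "STG.ORDERS")        # assumption
        tgt_cnt = row_count(tgt, "DW.FACT_ORDERS")    # assumption
        print(f"source={src_cnt} target={tgt_cnt}")
        sys.exit(0 if src_cnt == tgt_cnt else 1)
    finally:
        src.close()
        tgt.close()


if __name__ == "__main__":
    main()
```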

Confidential

Data Architect / ETL Specialist / Sr. ETL-BI Consultant

Responsibilities:

  • ETL development using Informatica PowerCenter and Oracle databases
  • Data Analysis and Test Data Preparation and Unit testing
  • Production Support and Post Implementation enhancement development
  • Performance Enhancements and Implementation Documentation
