- He has 6 years of IT experience as a Data Analyst, Scrum Master, and Business Analyst, with high proficiency in requirements analysis, design specification, and testing in both Waterfall and Agile methodologies, as well as experience in Machine Learning.
- Experience with the Software Development Life Cycle (SDLC), including requirements analysis, design specification, and testing at each phase, in both Waterfall and Agile methodologies.
- Experience in the Big Data ecosystem using the Hadoop framework and related technologies such as HDFS, MapReduce, Hive, HBase, and Sqoop, including working proficiency in Spark Core, Spark SQL, and Spark Streaming.
- Excellent knowledge of Hadoop architecture and a complete understanding of Hadoop daemons and components such as HDFS, YARN, ResourceManager, NodeManager, NameNode, DataNode, and the MapReduce programming paradigm.
- Knowledge of cloud services such as Microsoft Azure and Amazon AWS; worked on the latest Cloudera and Hortonworks Hadoop distributions.
- Experience with complex data processing pipelines, including ETL and data ingestion for unstructured and semi-structured data.
- Hands-on experience loading unstructured data (log files, XML, JSON) into HDFS, and importing/exporting data between SQL Server, Oracle, and HDFS using the ETL tool Sqoop.
- Hands-on experience with Spark and Spark Streaming: creating RDDs and applying transformations and actions to them.
- Experience writing complex SQL queries involving inner and outer joins across multiple tables to create views, indexes, stored procedures, and functions.
- Good understanding of creating Conceptual Data Models, Process/Data Flow Diagrams, Use Case Diagrams, Class Diagrams, and State Diagrams.
- Proficient in writing and debugging SQL queries, including fixing coding errors and other data-related problems.
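The RDD pattern noted above — lazy transformations followed by an action that triggers computation — can be illustrated with a pure-Python analogy. This is a sketch of the concept only, not actual PySpark code; the log lines are made up:

```python
# Pure-Python analogy to the Spark RDD pattern (not actual PySpark):
# transformations build a lazy pipeline, and only an action forces evaluation,
# mirroring something like rdd.map(...).filter(...).count().
log_lines = ["GET /home 200", "GET /missing 404", "POST /api 500"]

# "Transformations": lazy generators — nothing is computed yet.
status_codes = (int(line.split()[-1]) for line in log_lines)
error_codes = (code for code in status_codes if code >= 400)

# "Action": forces evaluation of the whole pipeline.
error_count = sum(1 for _ in error_codes)
```

As in Spark, the intermediate stages here hold no data of their own; the pipeline only runs when the final aggregation pulls values through it.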
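As a small illustration of the multi-table joins and view creation listed above — a minimal sketch using Python's built-in sqlite3 module, with hypothetical table and column names:

```python
import sqlite3

# In-memory database with two hypothetical tables: customers and orders.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 250.0), (11, 1, 100.0);
""")

# An outer join keeps customers with no orders; wrap the query in a view.
conn.execute("""
    CREATE VIEW customer_totals AS
    SELECT c.name, COALESCE(SUM(o.amount), 0) AS total
    FROM customers c
    LEFT OUTER JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
""")

rows = dict(conn.execute("SELECT name, total FROM customer_totals"))
```

The `LEFT OUTER JOIN` plus `COALESCE` keeps `Globex` in the result with a zero total, which an inner join would silently drop.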
Big Data/Hadoop: Spark Core, Spark SQL, Hive, Sqoop, MySQL
Operating Systems: UNIX, Windows NT, Windows 95/98/2000/XP/7, DOS
Programming Languages: SQL, Python, HiveQL, C#
Databases: Oracle 10g, SQL Server, MS Access, Hadoop (HBase, Hive)
Testing Tools: HP Quality Center, HP ALM, Web Tracker, Selenium WebDriver, TestNG, SOAP UI
Processes: SDLC, Agile, TDD
Defect Tracking: Quality Center, Rational ClearQuest, Bug Tracker, JIRA
Protocols: FTP, TCP/IP, HTTP
SDLC Methodologies: Agile, RAD, RUP, UML, Waterfall
Confidential, Minneapolis, MN
- Designed, created, and implemented database applications based on business requirements.
- Created complex queries to analyze Key Performance Indicators (KPIs) that determined system and operational health.
- Archived legacy data into a remote database, freeing nearly 20% of storage space and improving performance.
- Oversee and direct efforts to identify information and technology solutions that enable business needs and strategies.
- Own and define DevOps pipelines and release management for data engineering.
- Apply business knowledge and experience to effectively advise others on technology as an enabler.
- Act as a change agent to continuously improve and move the organization forward.
- Accountable for delivering the right results on initiatives in a timely and effective manner.
- Direct the work of others to lead initiatives that cross multiple assets, technologies, platforms, departments and vendors.
- Proactively mitigate risks across multiple assets, information domains, technologies and platforms.
- Performed code debugging, created exception handling for bad data, and generated error log files.
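The bad-data handling and error logging described in the last bullet can be sketched as follows — a hypothetical example where the file name and record layout are purely illustrative:

```python
import csv
import io
import logging

# Route unparseable records to an error log file instead of failing the run.
logging.basicConfig(filename="bad_rows.log", level=logging.ERROR)
log = logging.getLogger("ingest")

def parse_rows(text):
    """Return (id, amount) tuples from CSV text; log and skip bad rows."""
    good = []
    for lineno, row in enumerate(csv.reader(io.StringIO(text)), start=1):
        try:
            good.append((int(row[0]), float(row[1])))
        except (IndexError, ValueError) as exc:
            # Bad data is recorded with its line number, not propagated.
            log.error("line %d: bad data %r (%s)", lineno, row, exc)
    return good

rows = parse_rows("1,9.5\noops,2.0\n3,4.25")
```

Catching only the expected exception types keeps genuine bugs visible while quarantining malformed input for later review.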
Confidential, Stamford, CT
Associate Data Engineer
- Design, code, test, debug, and document complex database queries.
- Troubleshoot and resolve complex production issues.
- Design relational database models for small and large applications.
- Provide data analysis and validation.
- Work with teams for identification of performance issues in production.
- Tune and optimize PL/SQL and SQL in newly developed and existing applications.
- Work effectively with the Infrastructure architects and DBA teams to ensure that all approved development and deployment procedures are followed.
- Work with internal and external users to understand requirements, explain issues, and prioritize work.
- Follow and adopt the SDLC process.
- Utilized various data analysis and visualization tools for analysis, report design, and report delivery.
- Create statistical models based on researched information to provide conclusions that will guide the company and the industry into the future.
- Worked with VBA and Excel to track various data modules.
- Maintain, execute, and create SQL queries to support critical business needs.
- Created visually impactful dashboards in Tableau for data reporting, with Excel pivot tables and VLOOKUP for supporting analysis.
- Tracked coding issues and process bugs using JIRA, and coordinated production releases and release status.
- Built SQL queries, triggers, and views to enhance the performance of database procedures.
- Created and automated various SSRS reports and delivered user centric solutions using SSIS.
- Meet regularly with leadership to convey results and gather requirements.
- Assist in designing and implementing large-scale digital maps using Tableau dashboards.
- Work independently or collaboratively as required by project work.