- 12 years of experience in Snowflake, Oracle advanced SQL, PL/SQL, Forms/Reports, Hadoop, Spark/Scala, Sqoop, Talend, Kafka, Python, R, Go, Tableau, and AWS.
- Passionate about coding; a self-learner, self-motivated, and a team player comfortable with dynamic roles.
- Working with Cisco’s Data and Analytics team on a data quality project, developing ETL on the Snowflake cloud data warehouse hosted on AWS, with Python, Talend, Sqoop, and Kafka as data streaming and ingestion tools.
- Working on data analysis in Snowflake and data migration from various sources, including flat files, SAP HANA, Teradata, Oracle, MongoDB, Hive, and Kafka.
- Working on application design, application migration, data modeling, data ingestion, ETL processes, and visual dashboards/reports for the data quality application.
- Working on data quality and data processing using various Python libraries, including pandas, json, NumPy, matplotlib, requests, re, argparse, threading, email, smtplib, kafka-python, snowflake-connector-python, cx_Oracle, pyhdb, and openpyxl.
- Knowledge of AWS developer services such as IAM, S3, EC2, DynamoDB, Elastic Beanstalk, and related monitoring services.
- Basic knowledge of the R and Go languages.
- Fundamental coding knowledge of data science algorithms.
- Deep understanding of Snowflake micro-partitioning, Time Travel, SnowSQL, flat-file processing, and data processing using Python connectors.
- Experience in software development with advanced SQL and PL/SQL (procedures, functions, packages, triggers, views, materialized views, indexes, table partitions, collections, SQL*Loader, bulk data operations, etc.) in Oracle 12c, 11g, 10g, and 9i.
- Technical proficiency in a range of software development languages, applications, and databases; knowledge of advanced PL/SQL and performance tuning with TKPROF, Profiler, SQL Tuning Advisor, in-memory processing, and threading.
- Experience in application migration from Oracle 10g to 12c and from Oracle/HANA to Snowflake, and in custom Oracle ERP version upgrades.
- Strong experience in bulk data processing for data quality and for application/data migration; migrated about 8 TB of data in a short time using data-block-level scanning.
- Worked on the Spark framework: RDD operations, Spark SQL, Spark Streaming, etc.
- Knowledge of UNIX and the Hadoop big data framework, including MapReduce, Hive, and Sqoop.
- Knowledge of the data visualization tool Tableau.
- Hands-on experience in software development with Agile methodology.
- Experience in all phases of software development, from requirement analysis and system study to design, debugging, implementation, and testing.
- Excellent communication, interpersonal, and analytical skills, with the ability to resolve complex business requirements and software issues.
- Hands-on experience building and customizing interfaces with the Inventory, Purchase, Sales, Finance, Production, and Planning modules of a custom ERP.
- Experience with challenging requirements: BOM, WIP, general ledgers, trial balance, balance sheet, bulk data operations, and data migration.
- Working as an onsite coordinator, gathering customer requirements and handling project estimation, solution design, development, and delivery.
- Delivered code under pressure and on short notice for urgent production deployments.
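As an illustration of the kind of Python-based data-quality processing described above, here is a minimal stdlib-only sketch; the field names and validation rules are hypothetical, not the actual project checks:

```python
import re

# Illustrative record-level data-quality rules (field names and patterns
# are hypothetical examples, not the real project's rule set).
RULES = {
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "part_number": re.compile(r"^[A-Z]{2}-\d{4}$"),
}

def validate(record: dict) -> list:
    """Return a list of data-quality violations for one record."""
    errors = []
    for field, pattern in RULES.items():
        value = record.get(field)
        if value is None or value == "":
            errors.append(f"{field}: missing")
        elif not pattern.match(value):
            errors.append(f"{field}: bad format {value!r}")
    return errors

def quality_report(records: list) -> dict:
    """Summarize pass/fail counts across a batch of records."""
    failed = [r for r in records if validate(r)]
    return {"total": len(records), "failed": len(failed)}
```

In practice such checks would run against rows pulled from Snowflake (for example via the Python connector) rather than in-memory dicts; the structure of rule table plus per-record validator stays the same.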
Languages: Python, PL/SQL
DBMS/RDBMS: Snowflake cloud data warehouse, Oracle 12c/11g/10g/9i/8i
Development Tools: PL/SQL, SQL Developer, HDFS, Pig, Hive, Sqoop, Spark/Scala, Oracle
Operating System: Windows, UNIX, Linux.
Other Tools: ADO, Rally, Control-M, Tidal, Remedy, HP Quality Center, Kintana, Artifactory and VSS.
Technology: Snowflake, Python, Tableau, Talend, Kafka, Advanced PL/SQL with Oracle 12c, MongoDB, HANA, Teradata, Sqoop, Spark, Hive, Unix/Linux.
- Working on the Snowflake cloud data warehouse using Python/Java procedures and Tableau for a data analytics and data quality project.
- Working on application data model design, development, and application performance improvement techniques.
- Migrating and ingesting data into Snowflake using Python and Talend from various sources, including flat files, Oracle, MongoDB, Hive, HANA, and Teradata.
- Working on data quality issues, analyzing data defects and their impact on Cisco’s revenue.
- Working on ETL development in Snowflake.
- Building data quality dashboards in Tableau to track data quality trends.
- Developed a code-less framework for ETL data pipelines using Python.
- Setting up a Kafka-based data pipeline that processes millions of records per day.
- Worked with the MongoDB NoSQL database for data analysis.
- Worked on the Install Base and Service Contracts project: requirement analysis, design, development, and unit testing of backend APIs using Oracle PL/SQL.
- Tuned the performance of Oracle packages and procedures.
- Worked on design, development, and testing across the software life cycle using Agile methodology.
- Worked on application upgrade and migration.
- Developed an automated code migration and validation utility so team members can deploy and validate their code in a few clicks.
- Worked onsite (USA) as customer coordinator: gathering and analyzing customer requirements, cross-team coordination, project estimation, issue tracking, and implementation.
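The code-less ETL framework mentioned above can be sketched in miniature: the pipeline is declared as configuration data, and a small runner executes registered transform steps in order. The step names, config shape, and transforms here are illustrative assumptions, not the actual framework:

```python
# Registry mapping config-level step names to Python transform functions.
TRANSFORMS = {}

def transform(name):
    """Register a transform under a name that config files can reference."""
    def wrap(fn):
        TRANSFORMS[name] = fn
        return fn
    return wrap

@transform("strip")
def strip_fields(rows, fields):
    """Trim whitespace from the named string fields (illustrative step)."""
    return [{k: (v.strip() if k in fields and isinstance(v, str) else v)
             for k, v in row.items()} for row in rows]

@transform("drop_nulls")
def drop_nulls(rows, field):
    """Drop rows whose field is missing or empty (illustrative step)."""
    return [row for row in rows if row.get(field) not in (None, "")]

def run_pipeline(rows, config):
    """Apply each configured step to the rows, in declared order."""
    for step in config["steps"]:
        fn = TRANSFORMS[step["op"]]
        rows = fn(rows, **step.get("args", {}))
    return rows
```

With this shape, a new pipeline is just a JSON/YAML document listing steps and arguments, so non-developers can assemble ETL flows without writing Python.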
Development and Support
Technology: Oracle PL/SQL, UNIX, Forms and Reports 10g, Oracle BI.
- Worked on custom ERP modules (grading, purchase, sales, and inventory) for the diamond industry.
- Played a key role on a team resolving application locking issues.
- Developed a timesheet entry application, including user rights and data security.
- Participated in code reviews with team members and helped improve query performance and tuning.
- Helped other team members resolve coding and application issues.