
Senior Snowflake Developer Resume


Jersey City, NJ

SUMMARY

  • 9+ years of IT professional experience in data warehousing (ETL/ELT) technologies, including requirements gathering, data analysis, design, development, system integration testing, deployments, and documentation.
  • Over 5 years of designing and implementing fully operational, production-grade, large-scale data solutions on the Snowflake Data Warehouse.
  • Solid experience in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse.
  • Experience with Snowflake Multi-Cluster Warehouses.
  • Experience with Snowflake Virtual Warehouses.
  • Experience migrating on-premises databases to the Snowflake data warehouse.
  • In-depth knowledge of Data Sharing in Snowflake.
  • Data modeling for Data Warehouse/Data Mart development and data analysis for Online Transaction Processing (OLTP).
  • Involved in various projects related to data modeling, system/data analysis, design, and development for both OLTP and data warehouse environments.
  • Understanding of views, synonyms, indexes, joins, and partitioning.
  • Experience in extracting, transforming and loading (ETL) data from spreadsheets, database tables and other sources using Microsoft SSIS and Informatica.
  • Developed ETL pipelines into and out of the data warehouse using a combination of Python and Snowflake's SnowSQL (see the sketch after this list).
  • Wrote highly tuned, performant SQL on various database platforms, including MPP systems.
  • Developed source-to-target mapping spreadsheets for the ETL team, covering physical naming standards, data types, volumetrics, domain definitions, and corporate metadata definitions.
  • Excellent understanding of Snowflake Internals and integration of Snowflake with other data processing and reporting technologies.
  • Experience in building data warehousing and data marts with OLTP vs OLAP, star vs snowflake schema, normalization vs de-normalization methods.
  • Defined virtual warehouse sizing in Snowflake for different types of workloads.
  • Designed and coded required database structures and components.
  • Ensured incorporation of best practices and lessons learned from prior projects.
  • Coded stored procedures and triggers.
  • Supported various reporting teams; experienced with the data visualization tool Tableau.
  • Very good at SQL, data analysis, unit testing, and debugging data quality issues.
  • Azure knowledge: familiar with various Azure PaaS offerings, e.g., ADLS, Azure Blob Storage, Batch Service, Azure Data Factory V1, Key Vault, etc.
  • Worked with data scientists to understand requirements and to propose and identify data sources and alternatives.
  • Understand data classification and adhere to information protection, data governance, and privacy restrictions on data.
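
To illustrate the Python/SnowSQL pipeline work noted above, a minimal SnowSQL load sketch follows; the file format, stage, local file path, and target table names are all hypothetical:

    -- Hypothetical SnowSQL script: stage a local extract and bulk-load it
    CREATE OR REPLACE FILE FORMAT csv_fmt TYPE = 'CSV' SKIP_HEADER = 1;
    CREATE OR REPLACE STAGE etl_stage FILE_FORMAT = csv_fmt;
    PUT file:///data/daily_extract.csv @etl_stage;   -- client-side upload (SnowSQL)
    COPY INTO analytics.public.daily_sales
      FROM @etl_stage
      ON_ERROR = 'CONTINUE';                         -- skip bad rows rather than abort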

TECHNICAL SKILLS

Languages: Shell Scripting (K-Shell, C-Shell), PL/SQL, Control-M Scheduling, Autosys, JavaScript, HTML, Python

Databases: Teradata, Oracle, PostgreSQL, MongoDB, Azure Cosmos DB, SQL Server, MySQL

Agile Methodologies/DevOps: Jenkins, GitHub, Jira, CI/CD using PowerShell, Confluence, Stash

Big Data/Hadoop/Cloud: HDFS, Hive, Sqoop, Pig, Kafka, MapReduce, Azure Data Services, Snowflake design and development

Web services: REST API, SOAP

ETL/Snowflake: Talend Data Integration, Informatica suite (PowerCenter, IDQ, DIH, MDM, BDM), web services, SOAP

Cloud computing: Azure Data Factory, Azure Stream Analytics, Azure Databricks, Azure Data Lake Storage

Reporting Tools: Power BI, SAS, and Tableau

PROFESSIONAL EXPERIENCE

Confidential, Jersey City, NJ

Senior Snowflake Developer

Responsibilities:

  • Gathered business requirements to determine feasibility and convert them into technical tasks in the design document.
  • Created Snowflake warehouses (X-Large/2X-Large), databases, Snowpipes, and other utilities.
  • Knowledge of multi-cluster warehouses, auto-scaling, and auto-suspending warehouses.
  • Created SQL worksheets to stage and transform data.
  • Developed scripts using PL/SQL.
  • Involved in building a scalable, distributed data lake system for Confidential's real-time and batch analytical needs.
  • Set up SQS notifications that fire when Kinesis writes new data files, and triggered a Lambda script to automate appending the data to targets in near real time through Snowpipe (see the sketch after this list).
  • Performed tuning of Applications for setting right Batch Interval time, correct level of Parallelism and memory tuning.
  • Cloned databases, tables, schemas, etc.
  • Created, altered, and updated tables, views, file formats, data, etc.
  • Worked with different file formats such as CSV, JSON, and Parquet.
  • Worked using multiple roles (ACCOUNTADMIN, SYSADMIN, etc.).
  • Worked in determining various strategies related to data security.
  • Egressed data to vendors for business application reporting.
  • Performed data preparation and analysis using Alteryx, connecting to in-DB functions (Hive, Impala).
  • Automated legacy manual workflows into Alteryx and reported using Power BI.
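
A minimal sketch of the near-real-time ingestion described above, assuming a hypothetical S3 bucket, a pre-created storage integration named s3_int, and a raw.events target table; the pipe's notification channel is the SQS queue that the bucket's event notifications must point at:

    -- Hypothetical external stage over the S3 landing area
    CREATE OR REPLACE STAGE s3_landing
      URL = 's3://example-bucket/landing/'
      STORAGE_INTEGRATION = s3_int;            -- assumed pre-created integration

    -- Pipe that loads new files automatically as S3 event notifications arrive
    CREATE OR REPLACE PIPE landing_pipe AUTO_INGEST = TRUE AS
      COPY INTO raw.events
      FROM @s3_landing
      FILE_FORMAT = (TYPE = 'JSON');

    -- SHOW PIPES exposes the notification_channel (SQS ARN) to configure on the bucket
    SHOW PIPES LIKE 'landing_pipe';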

Environment: Snowflake, S3, AWS Redshift, Alteryx, Power BI

Confidential, Plano, TX

Snowflake Consultant

Responsibilities:

  • Migrated Informatica ETLs from the on-premises platform to an Azure-based cloud platform.
  • Designed the data flow from existing on-premises sources to Snowflake.
  • Lift and shift: converted SQL code from Oracle to Snowflake-compatible syntax.
  • Configured SnowSQL in Python for the audit mechanism.
  • Worked on a Python-based auto-ETL to load historical and incremental data from sources such as Hyperion and Salesforce to S3, and then from S3 to Snowflake.
  • Migrated Tableau extracts to Snowflake reporting tables and made them compatible.
  • Consolidated general ledger tables in Snowflake to improve performance and speed up refreshes.
  • Implemented SnowSQL configuration with vault integration, eliminating the password prompt.
  • Applied SnowSQL hints to improve query performance.
  • Used MERGE statements with COPY to move data from external storage into Snowflake (see the sketch after this list).
  • Migrated complex Informatica ETL to Snowflake.
  • Worked with Snowflake materialized views to publish data to Snowflake in live and extract modes.
  • Used the FLATTEN table function to produce lateral views of VARIANT, OBJECT, and ARRAY columns.
  • Migrated historical and incremental data from Oracle to Snowflake.
  • Migrated from on-premises Informatica to Informatica Cloud using PowerExchange.
  • Made heavy use of Agile process management tools (e.g., GUS).
  • Familiar with middleware technologies and test automation (Selenium, SoapUI, etc.).
  • Analyzed and managed third-party and other external data sets.
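
As a sketch of the COPY-plus-MERGE loading pattern noted in the list above, with hypothetical staging (stg_gl) and target (gl_balances) tables and an assumed external stage:

    -- Land the external files in a staging table first
    COPY INTO stg_gl
      FROM @azure_stage/gl/                          -- hypothetical external stage
      FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    -- Then upsert the staged rows into the target
    MERGE INTO gl_balances t
    USING stg_gl s
      ON t.account_id = s.account_id
    WHEN MATCHED THEN
      UPDATE SET t.balance = s.balance, t.load_ts = CURRENT_TIMESTAMP()
    WHEN NOT MATCHED THEN
      INSERT (account_id, balance, load_ts)
      VALUES (s.account_id, s.balance, CURRENT_TIMESTAMP());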

Environment: Snowflake, Azure, Python, Informatica, Tableau, Airflow, Tidal.

Confidential, Cincinnati, OH

Snowflake Developer

Responsibilities:

  • Evaluated Snowflake design considerations for any change in the application.
  • Built the logical and physical data models for Snowflake as per the required changes.
  • Defined roles and privileges required to access different database objects.
  • Defined virtual warehouse sizing in Snowflake for different types of workloads.
  • Designed and coded required database structures and components.
  • Published Power BI reports to the required organizations and made Power BI dashboards available in web clients and mobile apps.
  • Worked with cloud architect to set up the environment.
  • Involved in Migrating Objects from Teradata to Snowflake.
  • Created Snowpipe for continuous data loading.
  • Used COPY to bulk-load data.
  • Created internal and external stages and transformed data during load.
  • Used the FLATTEN table function to produce lateral views of VARIANT, OBJECT, and ARRAY columns (see the sketch after this list).
  • Worked with both Maximized and Auto-scale functionality.
  • Used temporary and transient tables on different datasets.
  • Cloned Production data for code modifications and testing.
  • Shared sample data with customers for UAT by granting access.
  • Used Time Travel with up to 56 days of retention to recover missed data (see the sketch after this list).
  • Developed a data warehouse model in Snowflake for over 100 datasets using WhereScape.
  • Heavily involved in testing Snowflake to understand best possible way to use the cloud resources.
  • Developed ELT workflows using NiFi to load data into Hive and Teradata.
  • Worked on Migrating jobs from NiFi development to Pre-PROD and Production cluster.
  • Scheduled different Snowflake jobs using NiFi.
  • Used NiFi to ping Snowflake to keep the client session alive.
  • Worked on Oracle databases, Redshift, and Snowflake.
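
A minimal sketch of the FLATTEN usage referenced in the list above, assuming a hypothetical raw_orders table with a VARIANT column named payload:

    -- Explode a JSON array held in a VARIANT column into one row per element
    SELECT o.order_id,
           f.value:sku::STRING AS sku,
           f.value:qty::NUMBER AS qty
    FROM raw_orders o,
         LATERAL FLATTEN(INPUT => o.payload:items) f;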
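And a sketch of the Time Travel recovery referenced above, assuming a hypothetical orders table whose data retention window covers the full 56 days:

    -- Recreate the table as it looked 56 days ago (retention permitting)
    CREATE OR REPLACE TABLE orders_restored
      CLONE orders AT (OFFSET => -56*24*60*60);   -- offset is in seconds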

Confidential, Denver, CO

Data Engineer

Responsibilities:

  • Gathered business requirements to determine feasibility and convert them into technical tasks in the design document.
  • Created Snowflake warehouses (X-Large/2X-Large), databases, Snowpipes, and other utilities.
  • Worked on multi-cluster warehouses, auto-scaling, and auto-suspending warehouses.
  • Created SQL worksheets to stage and transform data.
  • Cloned databases, tables, schemas, etc.
  • Created, altered, and updated tables, views, file formats, data, etc.
  • Worked with different file formats such as CSV, JSON, and Parquet.
  • Worked using multiple roles (ACCOUNTADMIN, SYSADMIN, etc.).
  • Worked on SnowSQL and Snowpipe.
  • Converted Talend joblets to support Snowflake functionality.
  • Created Snowpipe for continuous data loading.
  • Developed stored procedures and views in Snowflake and used them in Talend for loading dimensions and facts.
  • Used COPY to bulk load the data.
  • Created data sharing between two Snowflake accounts (see the sketch after this list).
  • Created internal and external stages and transformed data during load.
  • Redesigned views in Snowflake to improve performance.
  • Used Talend big data components such as Hadoop and S3 buckets, plus AWS services for Redshift.
  • Validated data from SQL Server against Snowflake to ensure an accurate match.
  • Involved in the Requirements gathering to conduct the POC on Snowflake.
  • Built solutions once and for all, with no band-aid approach.
  • Implemented Change Data Capture technology in Talend to load deltas to a Data Warehouse.
  • Designed, developed, tested, implemented, and supported data warehousing ETL using Talend.
  • Reviewed and tested Oracle stored procedure code with plsqlut7 functionality.
  • Expertise in managing a robust environment across Windows and UNIX.
  • Scheduled and designed Control-M jobs and batch monitoring.
  • Created rich graphic visualizations/dashboards to enable a fast read on claims and key business drivers and to direct attention to key areas.
  • Created a series of signboard and dashboard reports built for a new process in Horizon.
  • Mastered the ability to design and deploy rich graphic visualizations with drill-down and drop-down menu options, parameterized using Tableau.
  • Created complex data views manually using multiple measures; also used sort, filter, group, and create-set functionality.
  • Published reports, workbooks, and data sources to the server and exported reports in different formats.
  • Used URLs, hyperlinks, and filters to develop complex dashboards.
  • Created aggregations, calculated fields, table calculations, totals, and percentages using key performance indicators (KPIs) and measures.
  • Validated the reports before publishing reports to server.
  • Administered user, user groups, and scheduled instances for reports in Tableau.
  • Developed Custom SQL scripts to view required claim data in developed dashboards.
  • Optimized SQL scripts to improve the efficiency on the database.
  • Involved in performance tuning for Tableau dashboards.
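
A minimal sketch of the account-to-account data sharing referenced in the list above; the database, schema, share, and account identifiers are all hypothetical:

    -- Provider side: create a share and expose one schema's tables
    CREATE SHARE sales_share;
    GRANT USAGE ON DATABASE sales_db TO SHARE sales_share;
    GRANT USAGE ON SCHEMA sales_db.public TO SHARE sales_share;
    GRANT SELECT ON ALL TABLES IN SCHEMA sales_db.public TO SHARE sales_share;
    ALTER SHARE sales_share ADD ACCOUNTS = xy12345;   -- hypothetical consumer account

    -- Consumer side: mount the share as a read-only database
    CREATE DATABASE sales_shared FROM SHARE provider_acct.sales_share;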

Environment: Snowflake, Alteryx, Hive, Impala, Informatica IDQ, Tableau, SQL, UNIX, Oracle 12c, Windows, Stash, Jenkins, GitHub, Control-M, Hadoop.

Confidential

DWH-BI Consultant

Responsibilities:

  • Responsible for the requirement gathering and design.
  • Worked as a MicroStrategy Architect responsible for Schema Design and Development using desktop.
  • Built mappings, mapplets, and workflows using Informatica 8.6.1.
  • Created data marts for reporting in Teradata.
  • Used Repository Manager to export and import XML from development folders to the central repository and on to other environments.
  • Responsible for building new reports and dashboards, troubleshooting reports, enhancements, and performance tuning of the MicroStrategy Intelligence Server.
  • Created and designed documents with different controls such as HTML containers, grid/graphs, panel stacks, etc.
  • Used different formatting and grouping techniques to support end-user requirements on documents.
  • Designed dynamic dashboards with more interactivity using components such as selectors and widgets.
  • Created documents with links on attributes, metrics, hierarchies, and object prompts on a grid/graph to enable another document.
  • Handled prompt answers in target documents in different ways, such as using existing prompt answers from the source or using the objects selected in the source.

Environment: MicroStrategy 9.0.1, Informatica 8.6.1, Teradata v13, Power Shell, IBM Data Architect

Confidential

Informatica Developer

Responsibilities:

  • Reviewed ETL development, worked closely with the team to drive quality of implementation, and ensured unit testing was completed and quality audits were performed on the ETL work.
  • Designed the ETL specification documents to gather workflows information from offshore and shared with Integration and production maintenance team.
  • Created the ETL technical specification for the effort based on business requirements; the source system was mainly Oracle Master Data.
  • Worked closely with Business Analyst and Shared Services Technical Leads in defining technical specifications and designs for Oracle based large data warehouse environment.
  • Developed detailed ETL implementation design based on technical specification for BI effort within the ETL design standards and guidelines.
  • Performed Control-M job scheduling; designed and monitored jobs in Planning.
  • Created unit testing scripts and representative unit testing data based on business requirements.
  • Ensured testing data is available for unit and acceptance testing within development and QA environments.
  • Unit tested ETL code/scripts to ensure correctness of functionality and compliance with business requirements.
  • Refined ETL code/scripts as needed to enable migration of new code from development to QA and QA to production environments following the standard migration signoff procedure.
  • Worked with the team Manager and Business Analysts of the team to understand prioritization of assigned trouble tickets and to understand business user needs/requirements driving the support requests.
  • Performed problem assessment, resolution, and documentation, as well as performance and quality validation, on new and existing BI environments.

Environment: Informatica PowerCenter 9.x, Toad 9.1, SQL Developer, UNIX, Oracle 11g, SQL Server, Erwin data modeler, Windows.
