IT Analyst Resume
OBJECTIVE:
To build a career with committed and dedicated people, which will help me grow as a key player in a challenging, innovative, and flexible environment.
SUMMARY:
- 7+ years of experience in the IT industry, including 5 years of hands-on experience with Big Data ecosystem technologies.
- Hands-on experience working with ecosystems comprising Hive, Spark, Sqoop, Pig, Oozie, MapReduce, Tableau, Informatica, Teradata, and Core Java.
- Expert in working with Hive: creating tables, distributing data through partitioning and bucketing, and writing and optimizing HiveQL queries (see the brief sketch at the end of this summary).
- Experience in importing and exporting data between relational database systems and HDFS using Sqoop.
- Experience in designing both time-driven and data-driven automated workflows using Oozie.
- Resolved Hive performance issues through a strong understanding of joins, grouping, and aggregation.
- Hands-on experience with UNIX and shell scripting for automation.
- Hands-on experience using Hive partitioning and bucketing and executing different types of joins on Hive tables.
- Good experience in writing complex SQL queries in Hive, Teradata, and Oracle.
- Good experience with the NoSQL database HBase.
- Knowledge of Kafka, including creating consumer and producer APIs for broadcast messaging.
- Good experience using reporting tools such as Tableau to create reports on Hive data.
- Performed all dimensions of ETL development, including extracting, transforming, and loading data from various sources into data warehouses and data marts using PowerCenter (Repository Manager, Designer, Workflow Manager, and Workflow Monitor).
- Key participant in all phases of the software development life cycle: analysis, design, development, integration, implementation, debugging, and testing of software applications.
- Strong skills in resolving Teradata issues and providing workarounds for problems.
- Exposed to all enhancement/production support activities and deliverables.
- Logical and analytical, with good interpersonal skills and a commitment to quality work.
- Hard worker with a flexible approach.
- Quick learner with strong analytical ability and a zeal for new technologies and tools.
- Results-oriented; able to work independently as well as in a team.
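For illustration, a minimal Spark/Scala sketch of the Hive partitioning and bucketing mentioned above; the table and column names (sales, order_date, customer_id) are hypothetical:

```scala
import org.apache.spark.sql.SparkSession

object HivePartitioningSketch {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport lets spark.sql() run HiveQL against the Hive metastore.
    val spark = SparkSession.builder()
      .appName("HivePartitioningSketch")
      .enableHiveSupport()
      .getOrCreate()

    // Partition by a low-cardinality column (one directory per order_date);
    // bucket by a join key so joins and sampling on that key stay efficient.
    spark.sql("""
      CREATE TABLE IF NOT EXISTS sales (
        order_id    BIGINT,
        customer_id BIGINT,
        amount      DOUBLE
      )
      PARTITIONED BY (order_date STRING)
      CLUSTERED BY (customer_id) INTO 32 BUCKETS
      STORED AS ORC
    """)

    // A filter on the partition column prunes the scan to one partition directory.
    spark.sql("""
      SELECT customer_id, SUM(amount) AS total_amount
      FROM sales
      WHERE order_date = '2017-01-15'
      GROUP BY customer_id
    """).show()

    spark.stop()
  }
}
```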
TECHNICAL SKILLS:
Technologies: Hive, Spark, Sqoop, MapReduce, Pig, Kafka, Teradata, Informatica
Platforms: Windows, UNIX, Hadoop
Tools: Tableau, Oozie, SQL*Plus, Informatica, Toad, Teradata SQL Assistant, Magellan Tool, Maverick, Event Engine
PROFESSIONAL EXPERIENCE:
Confidential
IT Analyst
Responsibilities:
- Fully involved in the requirement analysis phase.
- Developed Hive scripts to meet end users', analysts', and product managers' requirements for ad hoc analysis.
- Involved in creating Hive internal and external tables, loading them with data, and writing Hive queries that require multiple join scenarios. Created partitioned and bucketed tables in Hive based on the hierarchy of the dataset.
- Imported and exported data between RDBMS and HDFS using Sqoop.
- Extensively used UNIX shell scripting and pulled logs from the servers.
- Migrated existing HiveQL to Spark using Scala as part of a POC.
- Developed workflows in Oozie to automate the tasks of loading data into HDFS.
- Involved in converting Hive/SQL queries into Spark transformations and actions using Spark SQL (RDDs and DataFrames) in Scala (a brief sketch follows this list).
- Implemented Spark SQL queries with Scala for faster testing and processing of data.
- Reviewed peers' Hive table creation, data loading, and queries.
- Involved in analyzing system failures, identifying root causes, and recommending courses of action.
- Responsible for reviewing the team's code and handing over the UAT file to business users for validation.
- Performed code commits, production releases, and post-implementation validations.
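As an illustration of the Hive-to-Spark conversion described above, a minimal Scala sketch; the orders/customers tables and their columns are hypothetical:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object HiveToSparkSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HiveToSparkSketch")
      .enableHiveSupport()
      .getOrCreate()

    // The original HiveQL can run unchanged through Spark SQL...
    val viaSql = spark.sql("""
      SELECT c.region, COUNT(*) AS order_count
      FROM orders o JOIN customers c ON o.customer_id = c.customer_id
      GROUP BY c.region
    """)
    viaSql.show()                              // action

    // ...or be rewritten as DataFrame transformations plus an action.
    val orders    = spark.table("orders")
    val customers = spark.table("customers")
    val viaDf = orders
      .join(customers, "customer_id")          // transformation
      .groupBy("region")                       // transformation
      .agg(count(lit(1)).as("order_count"))    // transformation
    viaDf.show()                               // action: triggers execution

    spark.stop()
  }
}
```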
Confidential
IT Analyst
Responsibilities:
- Analyzed and understood the business requirements.
- Imported data from different relational data sources, such as Teradata, into HDFS using Sqoop.
- Migrated Teradata tables by creating Hive internal and external tables, loading them with data, and writing Hive queries that require multiple join scenarios. Created partitioned and bucketed tables in Hive based on the hierarchy of the dataset.
- Analyzed large data sets by running Hive queries and Pig scripts.
- Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs and Scala (see the sketch after this list).
- Loaded and transformed large sets of structured, semi-structured, and unstructured data.
- Developed UNIX shell scripts to create reports from Hive data.
- Developed Pig UDFs to pre-process the data for analysis.
- Developed workflows in Oozie to automate the tasks of loading data into HDFS.
- Used Tableau for data acquisition and visualization.
- Moved all log/text files generated by various products into HDFS locations.
- Rolled out the applications to production.
- Held timely connects with the onsite team to keep them updated on current status and to learn about newly introduced processes.
- Performed peer reviews, unit testing, and integrated system testing.
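A minimal RDD-based Scala sketch of the kind of Hive/SQL-to-Spark conversion referenced above; the HDFS path and record layout are hypothetical:

```scala
import org.apache.spark.sql.SparkSession

object RddConversionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("RddConversionSketch")
      .getOrCreate()
    val sc = spark.sparkContext

    // Hypothetical tab-delimited log lines: <user_id>\t<event>\t<bytes>
    val lines = sc.textFile("hdfs:///data/logs/events")

    // Equivalent of: SELECT user_id, SUM(bytes) FROM events GROUP BY user_id
    val bytesPerUser = lines
      .map(_.split("\t"))                      // transformation: parse fields
      .filter(_.length == 3)                   // transformation: drop malformed rows
      .map(f => (f(0), f(2).toLong))           // transformation: key by user_id
      .reduceByKey(_ + _)                      // transformation: aggregate per key
    bytesPerUser.take(10).foreach(println)     // action: materialize a sample

    spark.stop()
  }
}
```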
Confidential
IT Analyst
Responsibilities:
- Worked closely with the business and analytics teams in gathering the system requirements.
- Loaded data into the database using SQL scripts in the pilot phase.
- Created views to integrate different sources.
- Developed procedures to meet the requirements.
- Used Informatica to validate and test the business logic implemented in the mappings and to fix bugs. Developed reusable mapplets and transformations.
- Extensively used the Slowly Changing Dimensions technique for updating the dimensional schema (an illustrative sketch of the Type 2 logic follows this list).
- Performed peer reviews and testing of the workflows.
- Involved in production deployment activities.
- Involved in fixing issues in the existing applications.
- Developed Hive scripts to meet end users', analysts', and product managers' requirements for ad hoc analysis.
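Informatica SCD mappings are built in the Designer GUI rather than in code, so as a rough sketch of the equivalent Type 2 logic, here it is expressed in Spark/Scala (the stack used elsewhere in this resume); the dim_customer/stg_customer tables and their columns are hypothetical:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object Scd2Sketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("Scd2Sketch").getOrCreate()

    // dim_customer: cust_id, address, eff_date, end_date, is_current
    // stg_customer: cust_id, address, load_date (today's snapshot)
    val dim = spark.table("dim_customer")
    val stg = spark.table("stg_customer")

    val current = dim.filter(col("is_current"))
    val history = dim.filter(!col("is_current"))   // past versions never change

    // Current rows whose tracked attribute differs from the incoming snapshot.
    val changed = current.as("d").join(stg.as("s"), "cust_id")
      .filter(col("d.address") =!= col("s.address"))

    // Close out the superseded version as of the load date...
    val expired = changed.select(col("cust_id"),
      col("d.address").as("address"), col("d.eff_date").as("eff_date"),
      col("s.load_date").as("end_date"), lit(false).as("is_current"))

    // ...and open a new current version with an open-ended end date.
    val opened = changed.select(col("cust_id"),
      col("s.address").as("address"), col("s.load_date").as("eff_date"),
      lit("9999-12-31").as("end_date"), lit(true).as("is_current"))

    // Unchanged current rows stay current; brand-new keys omitted for brevity.
    val unchanged = current.join(changed.select("cust_id"), Seq("cust_id"), "left_anti")

    history.unionByName(unchanged).unionByName(expired).unionByName(opened)
      .write.mode("overwrite").saveAsTable("dim_customer_v2")

    spark.stop()
  }
}
```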
Confidential
IT Analyst
Responsibilities:
- Analyzed and understood the requirements.
- Upgraded Informatica from older versions (7.1 and 8.6) to the latest version (9.5.1).
- Checked and validated pre-installation requirements for installing Informatica on the new server.
- Migrated Informatica components to the new server and performed unit testing and system testing on the moved components.
- Ran the workflows using the cron scheduler.
- Fixed issues faced during workflow execution.
- Extensively involved in performance tuning of the Informatica ETL mappings by using caches, overriding SQL queries, and using parameter files.
Confidential
IT Analyst
Responsibilities:
- Analyzed and understood the requirement documents.
- Developed mappings and sessions to import, transform, and load data into the respective target tables and flat files using Informatica PowerCenter.
- Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner, Expression, and Sequence Generator, working in Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
- Performed unit testing for the workflows.
- Worked on loading data from flat-file sources into targets using Teradata MultiLoad (MLOAD), FastLoad, and BTEQ.
- Worked on exporting data to flat files using Teradata FastExport (FEXPORT).
- Involved in peer reviews of the mappings, validating that the populated data met the requirements.
- Generated and updated reports in BusinessObjects (BO) as per requirements.