Application Developer Resume
PROFESSIONAL SUMMARY:
- Techno-functional professional looking to leverage 7+ years of IT solutions experience, with Investment Banking knowledge of Risk Management, into the Data Analytics stream.
- 4 years of experience in Big Data technologies and deep insight into the Hadoop Ecosystem.
- Good expertise in analyzing Big Data on Apache Hadoop using MapReduce jobs, HiveQL, and Pig Latin.
- Strong experience in writing Pig Latin scripts, writing and registering UDFs in Java, creating tables in Pig Latin, and loading data into target tables.
- Skilled in writing Hive scripts and Java UDFs, creating tables with partitions and bucketing, and loading data into Hive tables.
- Exposure to the Spark stack and hands-on experience working with RDDs and DataFrames.
- Expertise with the Apache Parquet and Avro Hadoop file formats.
- Experience using the Impala shell and a good understanding of the Impala SQL execution engine.
- Worked extensively to integrate Cloudera Impala with the BI tool Tableau.
- Skilled in using the Apache Oozie Workflow Engine for coordinating and scheduling workflow jobs on the cluster.
- Experience working with BI teams to translate big data requirements into Hadoop-centric technologies.
- Expertise in developing solutions to analyze large data sets efficiently.
TECHNICAL SKILLS:
HADOOP ECOSYSTEM: HDFS, MapReduce
DATA WAREHOUSING: Aginity Netezza
PROGRAMMING LANGUAGES: Java, J2EE
DATABASE TOOLS: Oracle, MySQL, Microsoft SQL Server 2005
WEB DEVELOPMENT: Struts, Spring, JSP, JavaScript, Servlets, Hibernate, HTML
PROJECT MANAGEMENT TOOLS: MS Project 2007, Visio, MS Office, Agile Methodology
PROFESSIONAL EXPERIENCE:
Confidential
APPLICATION DEVELOPER
Responsibilities:
- Created Impala views to integrate with BI tools for faster response times (see the Impala view sketch after this list).
- Worked extensively on converting MapReduce tasks to PySpark.
- Worked with the Spark SQL context to create DataFrames that filter input data for model execution.
- Used Spark's map transformation on DataFrames to calculate loan losses in parallel (see the PySpark sketch after this list).
- Converted Hadoop text-format data to Parquet for fast scanning and access of large datasets.
- Initiated and implemented a Value Added Process (VAP) on Hadoop to interact with the UI and load Hive tables dynamically based on user inputs.
- Worked on Sentry migration for Hadoop and Impala groups.
- Automated Oozie workflows to load staging tables on Netezza on a monthly and quarterly basis.
- Created a Spring Batch job to load entitlement and permission-related information from the Risk Insight portal into the centralized security user database.
- Automated the process of portal login notifications to dormant users.
- Configured WebLogic 12c to create data sources for different databases.
- Set up BladeLogic to manage the web server and application servers for the UI portal.
- Integrated MicroStrategy and Tableau reports with the UI portal using iFrames.
- Created Hive scripts to parse logs and structure them in tabular format to facilitate effective querying of the log data.
- Worked on automating the process of deploying files to HDFS.
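
The PySpark bullets above describe a flow that a short sketch can make concrete: filter the model input through the Spark SQL context, compute loan losses in parallel with a map transformation, and persist the result as Parquet. The paths, column names, and the expected-loss formula below are illustrative assumptions, not the actual model logic.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("loan-loss-sketch").getOrCreate()

# Load the raw text-format loan records (hypothetical path and schema).
loans = spark.read.csv("/data/raw/loans", header=True, inferSchema=True)
loans.createOrReplaceTempView("loans")

# Filter the input down to the population the model needs via Spark SQL.
model_input = spark.sql("""
    SELECT loan_id, exposure, probability_of_default, loss_given_default
    FROM loans
    WHERE status = 'ACTIVE'
""")

# Compute a per-loan loss in parallel with a map transformation; the
# EL = EAD * PD * LGD formula is a stand-in for the real model logic.
losses = model_input.rdd.map(
    lambda row: (row.loan_id,
                 row.exposure * row.probability_of_default * row.loss_given_default)
).toDF(["loan_id", "expected_loss"])

# Persist in Parquet so downstream jobs get fast columnar scans.
losses.write.mode("overwrite").parquet("/data/curated/loan_losses")
```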
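
One possible shape for the Impala views that fed the BI tools, created here through the impyla client; the host, database, table, and column names are hypothetical, and the aggregation is only an example of pre-shaping data so Tableau queries stay responsive.

```python
from impala.dbapi import connect

# Connect to an Impala daemon (hypothetical host; 21050 is the default HiveServer2-protocol port).
conn = connect(host="impalad.example.com", port=21050)
cur = conn.cursor()

# Expose a pre-aggregated view so the BI layer never scans the full fact table.
cur.execute("""
    CREATE VIEW IF NOT EXISTS risk.v_loan_losses_monthly AS
    SELECT reporting_month,
           portfolio,
           SUM(expected_loss) AS total_expected_loss
    FROM risk.loan_losses
    GROUP BY reporting_month, portfolio
""")

cur.close()
conn.close()
```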
Confidential
SENIOR SOFTWARE ENGINEER
Responsibilities:
- Provided a solution using Hive and Sqoop (to export/import data) for faster data loads, replacing the traditional ETL process with HDFS for loading data into target tables.
- Created UDFs and Oozie workflows to Sqoop data from the source to HDFS and then into the target tables.
- Worked on Hadoop schema design; involved in file uploads from the UNIX/Linux file system to the Hadoop Distributed File System (HDFS) environment.
- Implemented custom data types, InputFormat, RecordReader, OutputFormat, and RecordWriter classes for MapReduce computations.
- Developed Pig UDFs to preprocess the data for analysis (see the Pig UDF sketch after this list).
- Used Pig Latin scripts to extract data from the output files, process it, and load it into HDFS.
- Implemented partitioning, dynamic partitions, and bucketing in Hive (see the Hive partitioning sketch after this list).
- Developed Hive queries to process the data and generate results in tabular format.
- Continuously monitored and managed the Hadoop cluster through Cloudera Manager
- Handled importing data from multiple data sources using Sqoop, performed transformations using Hive and MapReduce, and loaded the data into HDFS.
- Involved in designing and developing non-trivial ETL processes within Hadoop using tools like Pig, Sqoop, Flume, and Oozie.
- Formulated procedures for developing scripts and batch processes to schedule Hadoop jobs.
- Experienced in loading and transforming large sets of structured, semi-structured, and unstructured data; showcased a strong understanding of Hadoop architecture, including HDFS, MapReduce, Hive, Pig, Sqoop, and Oozie.
- Designed performance optimizations involving data transmission, data extraction, business validations, service logic, and job scheduling.
- Wrote Hive queries for data analysis to meet business requirements.
- Created Hive tables and worked on them using HiveQL.
- Segmented the data and created portfolios to feed into the Risk Frontier tool to simulate and calculate CVaR.
- Performed VAP on the data to apply business logic using the DW tool Netezza.
- Built a web-based UI using Adobe Flex for the TOH to perform these operations and auto-generate monthly reports.
- Created reports using the iReport tool and integrated them with the application.
- Wrote complex queries and stored procedures in Oracle to generate SER Admin Excel reports with Apache POI for stakeholder (SAG) usage on a quarterly basis.
- Developed a new feature for Admins to generate reports with Apache POI based on search criteria provided in an Excel sheet format.
- Analyzed the impact and effort required for enhancement or improvement requests from business partners.
- Prepared impact analyses for the different layers of the application.
- Scheduled tasks into various iterations following the Agile development methodology.
- Attended periodic Scrum calls to report daily and weekly accomplishments.
- Prepared test scripts for each story before starting to code.
- Made code changes across Struts and Hibernate configuration files, JSPs, and Java classes.
- Maintained code changes among all developers using the version control tool.
- Validated the changes, verified that the test scripts passed, and posted the story for review.
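
The Hive partitioning and bucketing bullets above follow a common table-design pattern; the sketch below shows one way it might look when issued through PyHive against HiveServer2. The host, database, table, columns, and bucket count are assumptions, not the actual schema.

```python
from pyhive import hive

# Connect to HiveServer2 (hypothetical host; 10000 is the default port).
conn = hive.connect(host="hiveserver2.example.com", port=10000)
cur = conn.cursor()

# Target table: partitioned by reporting month, bucketed by loan_id
# so joins and sampling on the key stay efficient.
cur.execute("""
    CREATE TABLE IF NOT EXISTS risk.loan_facts (
        loan_id STRING,
        exposure DOUBLE,
        expected_loss DOUBLE
    )
    PARTITIONED BY (reporting_month STRING)
    CLUSTERED BY (loan_id) INTO 32 BUCKETS
    STORED AS PARQUET
""")

# Let Hive derive partition values from the data itself (dynamic partitioning).
cur.execute("SET hive.exec.dynamic.partition=true")
cur.execute("SET hive.exec.dynamic.partition.mode=nonstrict")

# Load from a staging table; the partition column must come last in the SELECT.
cur.execute("""
    INSERT OVERWRITE TABLE risk.loan_facts PARTITION (reporting_month)
    SELECT loan_id, exposure, expected_loss, reporting_month
    FROM risk.staging_loan_facts
""")

cur.close()
conn.close()
```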
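
The Pig UDFs above were written in Java; the sketch below expresses an equivalent preprocessing UDF as a Pig Jython (Python) UDF only to keep these examples in one language. The field semantics and the cleaning rule are assumptions.

```python
# udf_cleanup.py -- registered from Pig Latin roughly as:
#   REGISTER 'udf_cleanup.py' USING jython AS cleanup;
#   clean = FOREACH raw GENERATE cleanup.normalize_amount(amount_str);

try:
    outputSchema  # the decorator is injected by Pig's Jython UDF engine at runtime
except NameError:
    def outputSchema(schema):  # no-op fallback so the module also runs outside Pig
        def decorator(func):
            return func
        return decorator


@outputSchema("amount:double")
def normalize_amount(value):
    """Strip currency symbols and thousands separators; return a float or None."""
    if value is None:
        return None
    cleaned = value.replace("$", "").replace(",", "").strip()
    try:
        return float(cleaned)
    except ValueError:
        return None
```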
Confidential
SOFTWARE ENGINEER
Responsibilities:
- Created UI screens using the zero degree platform and coded corresponding servlets.
- Integrated web services with the client using Groovy.
- Created web services to provide additional features to the application.
- Created screens using the zero degree platform and managed the server-side scripts.
- Responsible for client-side changes to the player logic using J2ME.