I'm a newly minted data analyst and visualization specialist with a strong background in mathematics and scientific research. As a former faculty member, I bring expertise in clear logical communication and in the analysis of applied problems and algorithms.
Data Analysis & Visualization
An intensive program focused on gaining technical programming skills in Excel, VBA, Python, R, JavaScript, SQL Databases, Tableau, Big Data, and Machine Learning.
Ph.D. in Applied Mathematics (Numerical Analysis)
A focused five-year research program on numerically solving integral equations using triangular functions. The research produced a comprehensive numerical method, inspired by a technique from physics, whose validity and accuracy make it applicable to functional equations including integral equations (IEs) and integro-differential equations (IDEs) of the first and second kind, singular equations, systems of nonlinear IEs and IDEs, and optimal control of time-varying and time-delay systems governed by IEs. The program's achievements include more than ten ISI articles published in international scientific journals.
M.Sc. in Applied Mathematics (Optimal Control)
A study of singular systems in control theory and a comparison of existing numerical methods.
B.Sc. in Applied Mathematics
Spotify's Popularity index uses an algorithm to evaluate a song, based largely on its play count. The better the evaluation, the more likely an artist and their music will be recommended to a larger audience. Put simply, the greater a song's Popularity index, the greater its presence in the Spotify community. The goal of this project is to determine which features impact the Popularity index of a given song on Spotify. By analyzing a song's audio features as well as some peripheral features, we look to identify patterns in how a song earns a high Popularity index. We created a model that predicts a song's popularity from these features.
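As a minimal sketch of the modeling idea (not the project's actual model or data), here is a linear regression of popularity on audio features in NumPy. The feature names (danceability, energy, tempo) and all values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical audio features for 200 songs: danceability, energy, tempo.
X = rng.uniform(0, 1, size=(200, 3))
# Synthetic popularity scores with a known linear relationship plus noise.
true_w = np.array([30.0, 20.0, 10.0])
y = X @ true_w + 15.0 + rng.normal(0, 2.0, size=200)

# Ordinary least squares via lstsq, with an intercept column prepended.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

intercept, weights = coef[0], coef[1:]
print("intercept:", round(intercept, 1))
print("weights:", np.round(weights, 1))
```

On synthetic data like this, the fitted weights recover the planted relationship; on real Spotify data the same fit would reveal which features move the index most.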
View Code

A neural network is a powerful machine learning technique modeled after the neurons in the brain. Neural networks can match the performance of robust statistical algorithms without requiring the user to work through the underlying statistical theory, which makes them one of the most in-demand skills for any data scientist. We explore and implement neural networks to analyze and classify the success of charitable donations using the TensorFlow platform in Python. Finally, we store and retrieve trained models for later reuse. Notably, most of our time was spent preparing the data.
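The project itself used TensorFlow; to keep this sketch dependency-free, here is the same idea, a small feed-forward classifier trained by gradient descent, written in plain NumPy on a toy two-blob dataset that stands in for the charity data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two Gaussian blobs stand in for "unsuccessful" vs "successful" donations.
n = 50
X = np.vstack([rng.normal([0, 0], 0.5, (n, 2)),
               rng.normal([3, 3], 0.5, (n, 2))])
y = np.vstack([np.zeros((n, 1)), np.ones((n, 1))])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# A tiny network: 2 inputs -> 4 hidden sigmoid units -> 1 sigmoid output.
W1 = rng.normal(0, 1, (2, 4)); b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(3000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass for the mean binary cross-entropy loss.
    dp = (p - y) / len(X)
    dW2 = h.T @ dp; db2 = dp.sum(0)
    dh = (dp @ W2.T) * h * (1 - h)
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

acc = ((p > 0.5) == (y > 0.5)).mean()
print("training accuracy:", acc)
```

In the actual project the equivalent model is a few lines of Keras, and the data-preparation step (encoding and scaling the charity features) dwarfs the modeling code, as noted above.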
View Code

The cryptocurrency data we work with is not ideal, so it must be processed to fit the machine learning models. Since there is no known output for what we are looking for, we turn to unsupervised machine learning. We work primarily with the K-means algorithm, the main unsupervised algorithm for grouping similar data into clusters. We then build on this by speeding up the process with principal component analysis (PCA), which condenses many features into a smaller set of components.
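A minimal NumPy sketch of the PCA-then-K-means pipeline, on synthetic data standing in for the scaled cryptocurrency features (the real project used trading and supply data and scikit-learn's implementations):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for scaled crypto features: three groups in 6 dimensions.
centers = rng.normal(0, 5, (3, 6))
X = np.vstack([c + rng.normal(0, 0.5, (40, 6)) for c in centers])

# PCA via SVD: project centered data onto its top two principal components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X2 = Xc @ Vt[:2].T                      # 120 x 2 reduced representation

# K-means with k = 3: seed centroids by farthest-first traversal, then
# repeat the classic assign/update loop until the centroids stop moving.
k = 3
cent = [X2[0]]
for _ in range(k - 1):
    dists = np.min([np.linalg.norm(X2 - c, axis=1) for c in cent], axis=0)
    cent.append(X2[dists.argmax()])
cent = np.array(cent)
for _ in range(100):
    labels = np.linalg.norm(X2[:, None, :] - cent[None, :, :], axis=2).argmin(1)
    new = np.array([X2[labels == j].mean(axis=0) for j in range(k)])
    if np.allclose(new, cent):
        break
    cent = new

print("cluster sizes:", np.bincount(labels, minlength=k).tolist())
```

Reducing to two components before clustering is what speeds K-means up: the distance computations run in 2 dimensions instead of the full feature space, while the well-separated structure survives the projection.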
View Code

All around the world, people borrow money to buy cars and homes, start businesses, and pursue education. Loans are an essential part of modern society and present both an opportunity and a challenge for banks and other lending institutions. On one hand, loans create revenue through the interest they generate; on the other hand, borrowers may default, and lenders may lose money. Banks rely on income, credit scores, and assets to reduce this risk. With the growth of financial technology, banks can use machine learning to analyze this risk, processing an applicant's data to arrive at a single lending decision.
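As an illustrative sketch of risk scoring (not the project's actual model), here is a logistic regression trained by gradient descent in NumPy; the two standardized features, income and credit score, and the synthetic labels are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical applicant features (standardized): income, credit score.
n = 300
X = rng.normal(0, 1, (n, 2))
# Synthetic labels: higher income and score lower the chance of default (1).
logits = -1.5 * X[:, 0] - 2.0 * X[:, 1]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(float)

# Logistic regression trained by batch gradient descent.
w = np.zeros(2); b = 0.0
lr = 0.5
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    grad_w = X.T @ (p - y) / n
    grad_b = (p - y).mean()
    w -= lr * grad_w
    b -= lr * grad_b

acc = ((p > 0.5) == (y > 0.5)).mean()
print("training accuracy:", round(acc, 2))
```

The fitted weights come out negative, mirroring the planted relationship: the higher an applicant's income and credit score, the lower the predicted default probability, which is exactly the single yes/no signal a lender wants.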
View Code

"Big Data" is a popular buzzword in the data industry today. But what does the term mean, and why is it important in the broader context of data science? In this analysis, we dig into the industry definition of "big data". We explore the big data ecosystem, including Spark, which improves on Hadoop's approach to handling big data. After diving into some of the technologies used with big data, we perform ETL on a dataset from Amazon Web Services (AWS). Cloud services let us store large amounts of data at remote locations rather than locally, on top of many other services, allowing for greater scalability and performance.
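The project ran its ETL with Spark on AWS; as a minimal, dependency-free illustration of the same extract-transform-load pattern, here is a sketch using only Python's standard library, with a tiny made-up review table standing in for the AWS dataset:

```python
import csv
import io
import sqlite3

# Extract: parse review rows from CSV text (a stand-in for the AWS dataset;
# the real pipeline reads from cloud storage through Spark).
raw = """review_id,star_rating,verified
R1,5,Y
R2,1,N
R3,4,Y
"""
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: keep only verified reviews and cast ratings to integers.
clean = [(r["review_id"], int(r["star_rating"]))
         for r in rows if r["verified"] == "Y"]

# Load: write the transformed rows into a relational table and query it.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE reviews (review_id TEXT, star_rating INTEGER)")
con.executemany("INSERT INTO reviews VALUES (?, ?)", clean)
avg = con.execute("SELECT AVG(star_rating) FROM reviews").fetchone()[0]
print("verified reviews:", len(clean), "avg rating:", avg)
```

Spark distributes each of these three stages across a cluster, which is what makes the same pattern viable when the table has millions of rows instead of three.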
View Code

Using JavaScript and the D3 library, we traverse and retrieve GeoJSON earthquake data and tectonic plate data in order to populate a geographical map. For this, we also use the Leaflet library and the Mapbox API. We use the Leaflet.js Application Programming Interface (API) to do the following steps:
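The project's traversal is written in JavaScript; to keep this page's sketches in one language, here is the GeoJSON-traversal step in Python, using a tiny made-up FeatureCollection shaped like the USGS earthquake feed:

```python
import json

# A tiny GeoJSON FeatureCollection in the USGS earthquake feed's shape
# (the real project fetches the live feed and renders it with Leaflet).
feed = json.loads("""{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "properties": {"mag": 4.2, "place": "Nevada"},
     "geometry": {"type": "Point", "coordinates": [-116.8, 38.2, 9.1]}},
    {"type": "Feature",
     "properties": {"mag": 6.1, "place": "Alaska"},
     "geometry": {"type": "Point", "coordinates": [-152.6, 59.9, 45.0]}}
  ]
}""")

# Traverse each feature, pulling out magnitude and [lon, lat, depth] --
# the same values Leaflet would use to place, size, and color each marker.
quakes = [(f["properties"]["mag"], f["geometry"]["coordinates"])
          for f in feed["features"]]
for mag, (lon, lat, depth) in quakes:
    print(f"M{mag} at ({lat}, {lon}), depth {depth} km")
```

Note that GeoJSON stores coordinates as [longitude, latitude, depth], while Leaflet's marker constructors expect latitude first, so the traversal has to swap the order before plotting.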