Gathering and Ingesting Data from Different Sources using NumPy and Pandas #

Data is the foundation of any data-driven project, and gathering data from various sources is a crucial step in the data analysis pipeline. In this article, we’ll explore how to gather data from different sources, such as websites, Kaggle datasets, and IoT sensors, and then ingest that data using NumPy and Pandas. We’ll also cover data cleaning and handling nonsense values so the data ends up in a usable, consistent format.

Gathering Data from Different Sources #

Web Scraping #

Web scraping is a common technique for gathering data from websites that do not offer an API or a downloadable dataset. Python libraries such as requests and BeautifulSoup make this straightforward.

import requests
from bs4 import BeautifulSoup

# Send a request to the website (the URL here is a placeholder)
url = "https://example.com/data"
response = requests.get(url, timeout=10)
response.raise_for_status()  # fail early on HTTP errors

# Parse the HTML content using BeautifulSoup
soup = BeautifulSoup(response.text, 'html.parser')

# Extract the text of each matching element from the page
data = []
for item in soup.find_all('div', class_='data-item'):
    data.append(item.text.strip())

Kaggle Datasets #

Kaggle is a popular platform for data science competitions and provides a vast collection of datasets. You can use the Kaggle API or directly download the datasets from the Kaggle website.

# Using the Kaggle API to download a dataset
# Install the client first: pip install kaggle
import os

# The Kaggle API reads credentials from ~/.kaggle/kaggle.json or from
# these environment variables; replace the values with your own.
os.environ["KAGGLE_USERNAME"] = "your_username"
os.environ["KAGGLE_KEY"] = "your_api_key"

import kaggle

kaggle.api.authenticate()
# 'dataset-name' stands in for the dataset's 'owner/dataset' slug on Kaggle
kaggle.api.dataset_download_files('dataset-name', path='./data', unzip=True)
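
Once the download finishes, the unzipped files land in ./data and can be read like any local dataset. The file name below is hypothetical; list the directory to see what the dataset actually contains.

import pandas as pd

# 'train.csv' is a hypothetical file name inside the unzipped download
df = pd.read_csv('./data/train.csv')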

IoT Sensors #

IoT sensors generate real-time readings from devices in the field. The data can be collected and stored in databases or CSV files, or exposed through APIs.

# Example: reading logged sensor data from a CSV file
import pandas as pd

# 'sensor_data.csv' is a placeholder for your exported sensor log
data = pd.read_csv('sensor_data.csv')
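
Sensors that publish readings over HTTP can also be polled directly. Here is a minimal sketch, assuming a hypothetical JSON endpoint that returns a list of reading objects:

import requests
import pandas as pd

# Hypothetical endpoint; replace with your sensor gateway's URL
url = "https://example.com/api/sensor-readings"
response = requests.get(url, timeout=10)
response.raise_for_status()

# Flatten the JSON payload into a tabular DataFrame
readings = pd.json_normalize(response.json())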

Ingesting Data using NumPy and Pandas #

Now that we have gathered data from different sources, let’s ingest it using NumPy and Pandas.

import numpy as np
import pandas as pd

# Ingest the gathered data using NumPy ('data' is the list collected earlier)
data_np = np.array(data)

# Ingest the same data using Pandas
data_pd = pd.DataFrame(data)
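
Once ingested, a quick sanity check shows what was actually loaded, including whether the values came through as numbers or as strings that still need conversion:

# Inspect shape, dtype, and the first few rows
print(data_np.shape, data_np.dtype)
print(data_pd.head())
data_pd.info()  # column dtypes and non-null counts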

Data Cleaning and Handling Nonsense Values #

Data collected from various sources might contain missing or erroneous values. Data cleaning involves identifying and handling these missing or nonsense values to ensure data quality and consistency.

Handling Missing Values #
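
First, it helps to see how much is actually missing; a quick count per column:

# Count missing values per column before deciding how to handle them
print(data_pd.isna().sum())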

# NumPy: replace NaNs with a sentinel value (assumes a float array)
data_np = np.nan_to_num(data_np, nan=-1)

# Pandas: replace missing values with the same sentinel
data_pd = data_pd.fillna(-1)

Handling Nonsense Values #

Nonsense values can occur when sensors malfunction or data is entered incorrectly.

# NumPy: flag out-of-range readings as NaN (requires a float array;
# the 1000 cutoff is an illustrative threshold, not a universal rule)
data_np[data_np > 1000] = np.nan

# Pandas: the same masking on a numeric DataFrame
data_pd[data_pd > 1000] = np.nan
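
For time-series sensor data, interpolating across the flagged gaps is often a better choice than a sentinel value; one option, assuming numeric columns:

# Fill the NaN gaps by interpolating between neighboring readings
data_pd = data_pd.interpolate(limit_direction="both")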

ETL (Extract, Transform, Load) #

The process of gathering data from different sources, cleaning and transforming it into a usable format, and finally loading it into a database or a data warehouse is known as ETL. ETL plays a crucial role in data integration and ensuring data quality for downstream analysis and reporting.

  1. Extract: Gathering data from various sources, such as websites, APIs, databases, or files.

  2. Transform: Cleaning, filtering, and transforming the data into a consistent and usable format.

  3. Load: Storing the transformed data into a database or data warehouse for further analysis.

By utilizing NumPy and Pandas, you can efficiently handle the transformation and cleaning steps of the ETL process, making data ingestion and preparation a streamlined and manageable task.
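
Here is a minimal end-to-end sketch of those three steps, assuming a hypothetical sensor_data.csv as the source and a local SQLite file as the destination:

import sqlite3
import pandas as pd

# Extract: read the raw data (the file name is a placeholder)
raw = pd.read_csv('sensor_data.csv')

# Transform: drop exact duplicates and fill missing values with a sentinel
clean = raw.drop_duplicates().fillna(-1)

# Load: write the cleaned table into a local SQLite database
conn = sqlite3.connect('warehouse.db')
clean.to_sql('sensor_readings', conn, if_exists='replace', index=False)
conn.close()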

Conclusion #

Gathering data from different sources is the starting point of any data-driven project. Python libraries like NumPy and Pandas provide powerful tools for ingesting and cleaning data collected from websites, Kaggle datasets, and IoT sensors. By handling missing and nonsense values, we ensure data quality and consistency, which is crucial for downstream analysis and decision-making. The ETL process, with the help of NumPy and Pandas, turns raw data into valuable insights and actionable information.
