Elementwise Vectorized Operations Practice Problem
This data science coding problem helps you practice elementwise vectorized operations and sharpen your implementation skills. Read the problem statement, write your solution, and strengthen your understanding of Broadcasting & Vectorization.
- Problem ID: 106
- Problem key: 106-elementwise-vectorized-operations
- URL: https://datacrack.app/solve/106-elementwise-vectorized-operations
- Difficulty: easy
- Topic: Broadcasting & Vectorization
- Module: NumPy Foundations
## Problem Statement
# 🧩 Elementwise Vectorized Operations
---
### 🎯 Goal
Vectorized operations apply a function to **every element** of an array simultaneously — without writing a single Python loop. This problem teaches you to implement three essential ML activation functions (`relu`, `sigmoid`, `tanh`) using pure NumPy. These functions are the building blocks of every neural network, and understanding their vectorized implementation is the first step toward building ML models from scratch.
---
### 🔍 The Three Activation Functions
| Function | Formula | Output Range | Use |
|:---------|:--------|:-------------|:----|
| **ReLU** | $\max(0, x)$ | $[0, +\infty)$ | Hidden layers in deep networks |
| **Sigmoid** | $\frac{1}{1 + e^{-x}}$ | $(0, 1)$ | Binary classification output |
| **Tanh** | $\tanh(x)$ | $(-1, 1)$ | Hidden layers; centered around 0 |
All three can be implemented with a **single NumPy expression** — no loops, no list comprehensions:
```python
relu = np.maximum(0, arr)
sigmoid = 1 / (1 + np.exp(-arr))
tanh = np.tanh(arr)
```
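For a concrete sense of what these expressions produce, here is a small self-contained demo; the sample array is arbitrary, and the printed values in the comments are rounded:
```python
import numpy as np

arr = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # arbitrary sample values

print(np.maximum(0, arr))      # ReLU:    [0. 0. 0. 1. 2.]
print(1 / (1 + np.exp(-arr)))  # sigmoid: ~[0.119 0.269 0.5 0.731 0.881]
print(np.tanh(arr))            # tanh:    ~[-0.964 -0.762 0. 0.762 0.964]
```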
---
### 💻 Task
Implement `apply_activation(data, func_name)` that dispatches to the correct activation function.
---
### 📥 Input
- `data`: list of numbers (1D or 2D)
- `func_name`: string — `"relu"`, `"sigmoid"`, or `"tanh"`
### 📤 Output
- Transformed array as a Python list (same shape as input)
---
### 🧩 Starter Code
```python
import numpy as np

def apply_activation(data, func_name):
    """
    Apply an activation function element-wise.

    Args:
        data (list): Input numbers (1D or 2D)
        func_name (str): "relu", "sigmoid", or "tanh"

    Returns:
        list: Transformed array as a Python list
    """
    arr = np.array(data, dtype=float)
    # 🧠 TODO: Check which activation function was requested
    # 🧠 TODO: Apply the correct NumPy operation to the whole array
    # 🧠 TODO: Return the result as a Python list
    pass
```
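If you want to check your work, here is a minimal sketch of one possible completion. A dictionary dispatch is one idiomatic option, not the only valid approach:
```python
import numpy as np

def apply_activation(data, func_name):
    """Apply "relu", "sigmoid", or "tanh" element-wise and return a list."""
    arr = np.array(data, dtype=float)

    # Map each name to a vectorized NumPy expression
    activations = {
        "relu": lambda a: np.maximum(0, a),
        "sigmoid": lambda a: 1 / (1 + np.exp(-a)),
        "tanh": np.tanh,
    }
    if func_name not in activations:
        raise ValueError(f"Unknown activation: {func_name!r}")

    # .tolist() preserves the input shape (1D stays 1D, 2D stays 2D)
    return activations[func_name](arr).tolist()
```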
---
### 💡 Example
```python
apply_activation([-2, -1, 0, 1, 2], "relu")
# Expected: [0.0, 0.0, 0.0, 1.0, 2.0]
apply_activation([-2, -1, 0, 1, 2], "sigmoid")
# Expected: [0.119..., 0.269..., 0.5, 0.731..., 0.881...]
apply_activation([[-1, 2], [-3, 4]], "relu")
# Expected: [[0.0, 2.0], [0.0, 4.0]]
```
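As a sanity check on the sigmoid row: $\sigma(1) = \frac{1}{1 + e^{-1}} \approx \frac{1}{1 + 0.368} \approx 0.731$, which matches the expected output above.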
---
### 🔑 Key Concepts
- `np.maximum(0, arr)` applies `max(0, x)` to every element — this is vectorized ReLU
- `np.exp(-arr)` computes $e^{-x}$ for every element simultaneously
- These operations work on **any shape** — 1D, 2D, or higher — without code changes
- The loop equivalent `[max(0, x) for x in data]` only works on 1D input and is much slower for large arrays (see the sketch below)
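
To make the last two points concrete, here is a small timing and shape sketch. The array size is illustrative and absolute timings will vary by machine, but the vectorized version should win by a wide margin:
```python
import time
import numpy as np

arr = np.random.randn(1_000_000)  # a large 1D array of random values

# Vectorized ReLU: one call over the whole array
start = time.perf_counter()
vectorized = np.maximum(0, arr)
print(f"vectorized: {time.perf_counter() - start:.4f}s")

# Loop equivalent: one Python-level max() call per element
start = time.perf_counter()
looped = [max(0, x) for x in arr]
print(f"loop:       {time.perf_counter() - start:.4f}s")

# The same vectorized expression works unchanged on 2D input
matrix = np.array([[-1.0, 2.0], [-3.0, 4.0]])
print(np.maximum(0, matrix).tolist())  # [[0.0, 2.0], [0.0, 4.0]]
```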