# Precision Practice Problem
This data science coding problem lets you practice evaluation metrics, specifically precision, and implementation skills. Read the problem statement, write your solution, and strengthen your understanding of evaluation metrics.
- Problem ID: 22
- Problem key: 22-precision
- URL: https://datacrack.app/solve/22-precision
- Difficulty: easy
- Topic: Evaluation Metrics
- Module: Introduction to Machine Learning
## Problem Statement
## 🧩 Precision Score
### 🎯 Goal
Compute the **precision** of a binary classification model.
---
### 📥 Input / 📤 Output
**Input**
- `y_true` (`list[int]`): true labels (0 or 1)
- `y_pred` (`list[int]`): predicted labels (0 or 1)
✅ **Assumptions:**
- `y_true` and `y_pred` have the same non-zero length.
- Labels are binary (0 or 1).
**Output**
- `float`: precision score
---
### 💻 Task Description
Precision measures how reliable the model’s **positive predictions** are.
It is defined as:
$$
\text{Precision} = \frac{TP}{TP + FP}
$$
where:
- **TP (True Positives)**: predicted 1 and actually 1
- **FP (False Positives)**: predicted 1 but actually 0
---
### 🧩 Starter Code
```python
def precision_score(y_true, y_pred):
pass
```
---
### 💡 Example
```python
y_true = [1, 0, 1, 1, 0]
y_pred = [1, 1, 1, 0, 0]
precision_score(y_true, y_pred)
```
**Expected Output**
```
0.6666666667
```
(2 true positives out of 3 predicted positives)
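One way to approach this is to count TP and FP in a single pass over the paired labels, then apply the formula. The sketch below is not the only valid solution; in particular, returning `0.0` when there are no positive predictions is an assumption (the problem statement does not specify behavior for a zero denominator, though the stated assumptions make the inputs otherwise well-formed):

```python
def precision_score(y_true, y_pred):
    # TP: predicted 1 and actually 1
    tp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 1)
    # FP: predicted 1 but actually 0
    fp = sum(1 for t, p in zip(y_true, y_pred) if p == 1 and t == 0)
    if tp + fp == 0:
        # Assumed convention: no positive predictions -> precision 0.0
        return 0.0
    return tp / (tp + fp)

print(precision_score([1, 0, 1, 1, 0], [1, 1, 1, 0, 0]))  # 2 TP / 3 predicted positives
```

On the example above this yields 2/3 ≈ 0.6667, matching the expected output.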