prithivMLmods committed
Commit a209f37 · verified · 1 Parent(s): b8fd594

Update README.md

Files changed (1):
  1. README.md +73 -0
README.md CHANGED

tags:
- Siglip2
- ViT
---

# **Coral-Health**

> **Coral-Health** is an image classification vision-language encoder model fine-tuned from **google/siglip2-base-patch16-224** for a single-label classification task. It is designed to classify coral reef images into two health conditions using the **SiglipForImageClassification** architecture.

```py
Classification Report:
                 precision    recall  f1-score   support

Bleached Corals     0.8677    0.7561    0.8081      4850
```

![download (1).png](https://cdn-uploads.huggingface.co/production/uploads/65bb837dbfb878f46c77de4c/KMlOnMf0JTq1-5_7qGhjL.png)

The model categorizes images into two classes:

- **Class 0:** Bleached Corals
- **Class 1:** Healthy Corals
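
A quick way to confirm this index-to-label mapping at runtime is to read it from the loaded model's config. This is a minimal sketch; it assumes the repository's `config.json` carries the `id2label` mapping shown above.

```python
from transformers import SiglipForImageClassification

# Load the classifier and print its label mapping (assumed to match the list above)
model = SiglipForImageClassification.from_pretrained("prithivMLmods/Coral-Health")
print(model.config.id2label)  # expected: {0: "Bleached Corals", 1: "Healthy Corals"}
```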

---

# **Run with Transformers 🤗**

```python
!pip install -q transformers torch pillow gradio
```

```python
import gradio as gr
import torch
from PIL import Image
from transformers import AutoImageProcessor, SiglipForImageClassification

# Load the fine-tuned model and its image processor
model_name = "prithivMLmods/Coral-Health"
model = SiglipForImageClassification.from_pretrained(model_name)
processor = AutoImageProcessor.from_pretrained(model_name)

# Class index to label mapping
labels = {
    "0": "Bleached Corals",
    "1": "Healthy Corals"
}

def coral_health_detection(image):
    """Predicts the health condition of coral reefs in the image."""
    image = Image.fromarray(image).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")

    with torch.no_grad():
        outputs = model(**inputs)
        logits = outputs.logits
        probs = torch.nn.functional.softmax(logits, dim=1).squeeze().tolist()

    # Pair each class label with its predicted probability
    predictions = {labels[str(i)]: round(probs[i], 3) for i in range(len(probs))}

    return predictions

# Create the Gradio interface
iface = gr.Interface(
    fn=coral_health_detection,
    inputs=gr.Image(type="numpy"),
    outputs=gr.Label(label="Prediction Scores"),
    title="Coral Health Detection",
    description="Upload an image of coral reefs to classify their condition as Bleached or Healthy."
)

# Launch the app
if __name__ == "__main__":
    iface.launch()
```
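
For scripted use without the Gradio UI, the same model and processor can be called directly. The sketch below is illustrative only: the image path `reef_photo.jpg` is a placeholder, and the label mapping is taken from the class list above rather than read from the model config.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SiglipForImageClassification

model_name = "prithivMLmods/Coral-Health"
model = SiglipForImageClassification.from_pretrained(model_name)
processor = AutoImageProcessor.from_pretrained(model_name)

# Label mapping from the class list above
id2label = {0: "Bleached Corals", 1: "Healthy Corals"}

image = Image.open("reef_photo.jpg").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1).squeeze()

for idx, label in id2label.items():
    print(f"{label}: {probs[idx].item():.3f}")
```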

---

# **Intended Use:**

The **Coral-Health** model is designed to support marine conservation and environmental monitoring. Potential use cases include:

- **Coral Reef Monitoring:** Helping scientists and conservationists track coral bleaching events.
- **Environmental Impact Assessment:** Analyzing reef health in response to climate change and pollution.
- **Educational Tools:** Raising awareness about coral reef health in classrooms and outreach programs.
- **Automated Drone/ROV Analysis:** Enhancing automated underwater monitoring workflows (see the batch-analysis sketch below).
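
The sketch below illustrates the drone/ROV use case: classify a directory of extracted video frames and report the share flagged as bleached. The `frames/` directory and the `*.jpg` pattern are placeholders for your own frame-extraction pipeline; the class-0 = Bleached Corals mapping follows the class list above.

```python
from pathlib import Path

import torch
from PIL import Image
from transformers import AutoImageProcessor, SiglipForImageClassification

model_name = "prithivMLmods/Coral-Health"
model = SiglipForImageClassification.from_pretrained(model_name).eval()
processor = AutoImageProcessor.from_pretrained(model_name)

# Hypothetical directory of frames extracted from drone/ROV footage
frame_paths = sorted(Path("frames").glob("*.jpg"))
bleached = 0

for path in frame_paths:
    image = Image.open(path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    bleached += int(pred == 0)  # class 0 = Bleached Corals, per the class list above

if frame_paths:
    print(f"Bleached frames: {bleached}/{len(frame_paths)} ({bleached / len(frame_paths):.1%})")
```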