

Evaluation of the SAM3 model for detection of coconut palm trees in images

Introduction

This page documents my evaluation of SAM3, the most recent Segment Anything Model, for automated detection of coconut palm trees.

Code associated with this page is available at https://github.com/aubreymoore/sam3-2026-01-31

We start by running SAM3 on two test images:

Figure 1: A simple test image posted on the internet by the New York Times.

Figure 2: A complex test image from a roadside coconut rhinoceros beetle damage survey conducted on Efate Island, Vanuatu.

SAM3 detection results for a simple image

SAM3 performed very well on this image, returning high-confidence detections and precise segmentation masks even though the two coconut palms were heavily damaged (Figure 3).

Figure 3: SAM3 detection results from the simple image. This is the default annotated image returned by SAM3. Numbers are confidence levels.

SAM3 detection results for a complex image

SAM3 detected 25 coconut palms in this complex image. The default annotated image returned by SAM3 is too cluttered to be of much use, so I wrote my own code to display each detection separately. The resulting images are displayed in descending order of confidence.
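The display code lives in the linked repository; the core ranking step can be sketched as follows. Here I assume each detection is a dict with "score" and "object_index" keys; those key names are illustrative, not SAM3's actual output schema.

```python
def rank_detections(detections):
    """Sort detections by confidence, descending, and build a display
    caption for each one.

    `detections` is assumed to be a list of dicts with "score" and
    "object_index" keys -- the key names are my assumption, not SAM3's
    actual output schema.
    """
    ranked = sorted(detections, key=lambda d: d["score"], reverse=True)
    for n, det in enumerate(ranked, start=1):
        det["caption"] = (
            f"Detected object {n:02d} "
            f"confidence: {det['score']:.3f} "
            f"object_index: {det['object_index']}"
        )
    return ranked
```

Each ranked detection can then be cropped to its bounding box and plotted with its caption, which is how the per-detection images on this page were produced.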

A first look at these detections indicates that SAM3 does a remarkable job at detecting coconut palms in a complex image. It even finds dead standing stems without fronds and small objects.

There are no obvious false positive detections. However, a few detections include two or more coconut palms. Many of the segmentation masks are incomplete because palms are occluded by foreground objects.
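One way to automatically flag candidate multi-palm detections is a simple geometric check: does one bounding box largely enclose two or more others? This heuristic is my own sketch and was not part of the evaluation; boxes are (x1, y1, x2, y2) pixel coordinates.

```python
def flag_crowds(boxes, min_contained=2, cover=0.9):
    """Flag boxes that largely enclose two or more other boxes.

    `boxes` is a list of (x1, y1, x2, y2) tuples in pixel coordinates.
    Returns a parallel list of booleans. A purely geometric heuristic,
    not part of SAM3 or the original evaluation code.
    """
    def covered_fraction(outer, inner):
        # Fraction of `inner`'s area that lies inside `outer`.
        ix1, iy1 = max(outer[0], inner[0]), max(outer[1], inner[1])
        ix2, iy2 = min(outer[2], inner[2]), min(outer[3], inner[3])
        intersection = max(0, ix2 - ix1) * max(0, iy2 - iy1)
        area = (inner[2] - inner[0]) * (inner[3] - inner[1])
        return intersection / area if area else 0.0

    flags = []
    for i, outer in enumerate(boxes):
        contained = sum(
            1
            for j, inner in enumerate(boxes)
            if j != i and covered_fraction(outer, inner) >= cover
        )
        flags.append(contained >= min_contained)
    return flags
```

Detections flagged this way could be queued for manual review rather than discarded outright.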

Figure 4: SAM3 detection results from the complex image. This is the default annotation returned by SAM3. Numbers are confidence levels.


[Individual images for detected objects 02–26 appear here, each captioned with its confidence score and object index; the same values are listed in the table below.]

Results

Manual classification of detections

Column key: n = detection number, conf = confidence score, object_index = SAM3 object index; the remaining columns are attribute flags for each detection.

| n | conf | object_index | accept | healthy | damaged | vcuts | dead | crowd | occluded | other |
|---|------|--------------|--------|---------|---------|-------|------|-------|----------|-------|
| 2 | 0.842 | 16 | True | False | True | False | False | False | False | False |
| 3 | 0.771 | 5 | False | True | False | False | False | True | False | False |
| 4 | 0.764 | 9 | True | False | True | False | False | False | False | False |
| 5 | 0.743 | 3 | False | True | False | False | False | False | True | False |
| 6 | 0.730 | 12 | True | False | True | False | False | False | False | False |
| 7 | 0.699 | 2 | True | False | True | False | False | False | False | False |
| 8 | 0.691 | 20 | False | False | False | False | False | False | True | False |
| 9 | 0.678 | 7 | True | False | True | False | False | False | False | False |
| 10 | 0.650 | 0 | True | False | True | False | False | False | False | False |
| 11 | 0.630 | 24 | False | False | False | False | False | True | False | False |
| 12 | 0.616 | 8 | True | False | True | True | False | False | False | False |
| 13 | 0.568 | 22 | True | False | True | False | False | False | False | False |
| 14 | 0.535 | 15 | False | False | False | False | False | False | True | False |
| 15 | 0.511 | 14 | True | False | False | False | True | False | False | False |
| 16 | 0.480 | 4 | True | False | False | False | True | False | False | False |
| 17 | 0.470 | 11 | False | True | False | False | False | False | True | False |
| 18 | 0.458 | 10 | True | False | True | True | False | False | False | False |
| 19 | 0.412 | 21 | False | False | False | False | False | False | True | False |
| 20 | 0.402 | 19 | False | False | False | False | False | False | True | False |
| 21 | 0.397 | 23 | False | False | False | False | False | False | True | False |
| 22 | 0.348 | 6 | False | False | False | False | False | False | True | False |
| 23 | 0.344 | 18 | False | False | False | False | False | True | False | False |
| 24 | 0.320 | 13 | False | False | False | False | False | False | True | False |
| 25 | 0.281 | 17 | False | False | False | False | False | True | False | False |
| 26 | 0.260 | 1 | False | False | False | False | False | False | True | False |
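The attribute counts in the table can be tallied programmatically. The snippet below transcribes the table rows (n, conf, object_index, then the eight attribute flags) and computes a few summary figures; the row data comes straight from the table above, but the code itself is my own sketch, not part of the original analysis.

```python
# Rows transcribed from the results table:
# (n, conf, object_index, accept, healthy, damaged, vcuts, dead, crowd,
#  occluded, other)
T, F = True, False
ROWS = [
    (2, 0.842, 16, T, F, T, F, F, F, F, F),
    (3, 0.771, 5, F, T, F, F, F, T, F, F),
    (4, 0.764, 9, T, F, T, F, F, F, F, F),
    (5, 0.743, 3, F, T, F, F, F, F, T, F),
    (6, 0.730, 12, T, F, T, F, F, F, F, F),
    (7, 0.699, 2, T, F, T, F, F, F, F, F),
    (8, 0.691, 20, F, F, F, F, F, F, T, F),
    (9, 0.678, 7, T, F, T, F, F, F, F, F),
    (10, 0.650, 0, T, F, T, F, F, F, F, F),
    (11, 0.630, 24, F, F, F, F, F, T, F, F),
    (12, 0.616, 8, T, F, T, T, F, F, F, F),
    (13, 0.568, 22, T, F, T, F, F, F, F, F),
    (14, 0.535, 15, F, F, F, F, F, F, T, F),
    (15, 0.511, 14, T, F, F, F, T, F, F, F),
    (16, 0.480, 4, T, F, F, F, T, F, F, F),
    (17, 0.470, 11, F, T, F, F, F, F, T, F),
    (18, 0.458, 10, T, F, T, T, F, F, F, F),
    (19, 0.412, 21, F, F, F, F, F, F, T, F),
    (20, 0.402, 19, F, F, F, F, F, F, T, F),
    (21, 0.397, 23, F, F, F, F, F, F, T, F),
    (22, 0.348, 6, F, F, F, F, F, F, T, F),
    (23, 0.344, 18, F, F, F, F, F, T, F, F),
    (24, 0.320, 13, F, F, F, F, F, F, T, F),
    (25, 0.281, 17, F, F, F, F, F, T, F, F),
    (26, 0.260, 1, F, F, F, F, F, F, T, F),
]

FLAGS = ["accept", "healthy", "damaged", "vcuts", "dead", "crowd",
         "occluded", "other"]

def summarize(rows):
    """Count how many detections carry each attribute flag."""
    counts = {flag: 0 for flag in FLAGS}
    for row in rows:
        for flag, value in zip(FLAGS, row[3:]):
            counts[flag] += int(value)
    return counts

counts = summarize(ROWS)
print(counts)
print("lowest accepted confidence:",
      min(row[1] for row in ROWS if row[3]))
```

On this transcription, 11 of the 25 detections were accepted, the lowest accepted confidence is 0.458, and every rejected detection carries either the crowd or the occluded flag.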
