Fisheries science is a field whose very foundation (“counting the fish in the ocean”) creates doubt in many anglers' minds. Asking recreational anglers to upload their fishing information from their smartphones raises doubts in just about everyone’s mind, fisheries scientists included. However, that has not stopped a few groups from steaming forward under the belief that something created by and for anglers will encourage them to report honestly and faithfully. The most extensive program to date is the Snook and Gamefish Foundation’s (SGF) iAngler app, the flagship of its Angler Action Program [1]. Originally started as a way to provide state scientists with more data on snook fishing in Florida, it has expanded to include fresh- and saltwater fish across the country, inevitably turning some heads in the fisheries community.
Stock
assessment scientists at the Florida Fish and Wildlife Conservation
Commission’s Fish and Wildlife Research Institute (FWRI) were interested in
getting as much data as possible to help with snook assessments, but were also concerned
about the reliability of this information.
Likewise, on the federal level, fisheries scientists with the National
Oceanic and Atmospheric Administration (NOAA) were skeptical about the validity
of data that is self-reported in a non-random manner [2]. To provide the best chance of getting
information that is representative of the whole angling population, there
should be a fully randomized sample of anglers.
This is not what an app like iAngler does; rather, it is utilized by
whoever is interested in downloading it.
What if only the most talented anglers use it (the anglers most
fisheries and social scientists would expect to use such an app)? Then the experts are left thinking every angler out there has that kind of success when they drop a line in the water. For reasons like this, an analysis
of these volunteer fishing apps is necessary to begin solidifying or revising
our assumptions.
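To make that concern concrete, here is a minimal toy simulation (invented numbers, not real angler data) in which the most successful fifth of a hypothetical angler population is twice as likely to volunteer a report; the self-selected sample noticeably overstates the population's true average catch rate.

```python
import random

random.seed(42)

# Hypothetical population of 10,000 anglers whose true catch rates are drawn
# from a gamma distribution with mean ~0.5 fish per hour (purely illustrative).
population = [random.gammavariate(2.0, 0.25) for _ in range(10_000)]

# Assume the most successful 20% of anglers are twice as likely to report a
# trip to a voluntary app as everyone else.
cutoff = sorted(population)[int(0.8 * len(population))]
volunteers = [
    rate for rate in population
    if random.random() < (0.20 if rate >= cutoff else 0.10)
]

true_mean = sum(population) / len(population)
app_mean = sum(volunteers) / len(volunteers)
print(f"True mean catch rate:       {true_mean:.2f} fish/hour")
print(f"Self-selected app estimate: {app_mean:.2f} fish/hour")
```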
NOAA’s
Marine Recreational Information Program (MRIP) survey is a randomized,
rigorously designed sampling initiative that has interviewers intercepting
anglers at boat ramps and beaches for catch-per-unit-effort (CPUE) information
and calling them on the phone for effort data.
Because, as Professor John Shepherd once said, “Managing fisheries is
hard: it’s like managing a forest, in which the trees are invisible and keep
moving around,” we have no way of knowing the real values of the variety of
fisheries metrics. However, something
like the MRIP provides data about as close to the “truth” as any other program. So when we sought to gauge the validity of
data from the iAngler app, we decided the best path would be to compare its
information to that of the MRIP.
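As a quick refresher before the comparisons, the three basic metrics fit together simply: effort is how much fishing happens, catch is how many fish come over the rail, and catch rate (CPUE) is catch divided by effort. A minimal sketch with made-up trip records follows; the field names are assumptions for illustration, and MRIP's real estimators use weighted, stratified survey designs that this toy calculation ignores.

```python
# Invented trip records -- not real survey data.
trips = [
    {"fish_caught": 2, "hours_fished": 4.0},
    {"fish_caught": 0, "hours_fished": 3.0},
    {"fish_caught": 5, "hours_fished": 6.0},
]

total_catch = sum(t["fish_caught"] for t in trips)      # catch
total_effort = sum(t["hours_fished"] for t in trips)    # effort, in angler-hours
cpue = total_catch / total_effort                       # catch per unit effort

print(f"Catch:  {total_catch} fish")
print(f"Effort: {total_effort} angler-hours")
print(f"CPUE:   {cpue:.2f} fish per angler-hour")
```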
For
specific comparisons, we chose the “Three Wise Men” of fisheries metrics (or
“Three Stooges,” depending on your perception of fisheries): effort, catch, and
catch-per-unit-effort (or catch rate).
The results that followed were in some ways expected, but surprising in
other ways. First, the only place that
had a reasonable number of trips reported under iAngler was south Florida, the
Atlantic side especially (where the app was created). Because of this, a lot of the fishing that
goes on in other parts of Florida is not being captured by the app, so using it on a statewide scale would be risky.
Also, the two programs were not comparable in scale; the number of MRIP boat-ramp interviews dwarfed the number of trips reported through iAngler. The app only launched in 2012 and has spread only by word of mouth, which likely explains its small size relative to NOAA’s 35-year-old nationwide sampling program. Also, the focus of the anglers using iAngler
was directed toward Florida’s popular inshore species: common snook, spotted
seatrout, and red drum. Even though
Floridians as a whole also like to fish offshore for snappers, groupers,
billfish, etc., the app adequately captures only those three inshore species.
While
this seems to be two strikes against the citizen-driven app, there was one big
question left: how do the catch rates compare?
The spatial bias toward southeast Florida might not persist if anglers in
other areas start using the app. And
even though it only has sufficient information for a handful of species,
scientists assess stocks individually anyway.
We moved forward by looking at catch rates for the three inshore fish,
but narrowed our focus to iAngler’s “hotspots,” in other words, south Florida
specifically. This allowed the
comparison to the MRIP’s catch rates to be more representative than a statewide
comparison. When we added this
specification, the iAngler catch rates were very similar to those of the
MRIP—for each of the three fish we considered.
This is surprising from a statistical standpoint, given that these anglers were not randomly selected to participate; reporting was entirely voluntary.
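For readers curious about the mechanics, the restricted comparison described above boils down to filtering both datasets to the same region and computing a per-species catch rate (here, fish per trip) from each. The sketch below uses invented records and field names; it is not the actual iAngler or MRIP data format or estimation procedure.

```python
def catch_rate_by_species(trips, region="southeast FL"):
    """Fish per trip by species, restricted to one region (toy calculation)."""
    totals = {}  # species -> (total fish, number of trips)
    for t in trips:
        if t["region"] != region:
            continue
        fish, n = totals.get(t["species"], (0, 0))
        totals[t["species"]] = (fish + t["fish_caught"], n + 1)
    return {sp: fish / n for sp, (fish, n) in totals.items()}

# Invented example records -- not real iAngler or MRIP data.
iangler_trips = [
    {"species": "snook", "region": "southeast FL", "fish_caught": 2},
    {"species": "snook", "region": "Panhandle", "fish_caught": 4},
    {"species": "red drum", "region": "southeast FL", "fish_caught": 1},
]
mrip_trips = [
    {"species": "snook", "region": "southeast FL", "fish_caught": 1},
    {"species": "red drum", "region": "southeast FL", "fish_caught": 1},
]

print("iAngler:", catch_rate_by_species(iangler_trips))
print("MRIP:   ", catch_rate_by_species(mrip_trips))
```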
To
sum it up, SGF’s iAngler app provides recreational fisheries information that
is spatially biased toward south Florida and contains mostly information on
snook, seatrout, and red drum. However,
when appropriate comparisons are made, the catch rates given by anglers are
very similar to those estimated by the MRIP survey. If the participation were to increase and
become more balanced throughout the state, a program like iAngler could provide
valuable data to fisheries scientists, especially for relatively rare and
perhaps poorly sampled fisheries like snook.
It even has some advantages over traditional survey methods like the one
utilized by the MRIP. Because boat-ramp
interviews take place after a trip is completed, they miss a lot of detailed
information about the fish that were thrown back—which is a lot of fish in
Florida’s fisheries. Users of an app
like iAngler can submit size, weight, and other information about every fish
they catch, not just the ones they bring back to land. Self-reporting programs will always carry an
undeniable statistical risk, but being aware of and accounting for potential
biases could give programs like iAngler a place in future recreational
fisheries management.
[1] Information about the Angler Action Program:
[2] The link below provides a good summary of the risks of using non-random, self-reported data for fisheries science: