Caroline DeLong - Academia.edu
Papers by Caroline DeLong
Proceedings of LAW VIII - The 8th Linguistic Annotation Workshop, 2014
Clinical decision-making has high-stakes outcomes for both physicians and patients, yet little research has attempted to model and automatically annotate such decision-making. The dual process model posits two types of decision-making, which may be ordered on a continuum from intuitive to analytical. Training clinicians to recognize decision-making style and select the most appropriate mode of reasoning for a particular context may help reduce diagnostic error. This study makes preliminary steps towards detection of decision style, based on an annotated dataset of image-based clinical reasoning in which speech data were collected from physicians as they inspected images of dermatological cases and moved towards diagnosis. A classifier was developed based on lexical, speech, disfluency, physician demographic, cognitive, and diagnostic difficulty features. Using random forests for binary classification of intuitive vs. analytical decision style in physicians' diagnostic descriptions, the model improved on the baseline by over 30%. The introduced computational model provides construct validity for decision styles, as well as insights into the linguistic expression of decision-making. Eventually, such modeling may be incorporated into instructional systems that teach clinicians to become more effective decision makers.
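The study's classifier is not reproduced here; as a minimal, hypothetical sketch of the kind of pipeline the abstract describes, the snippet below trains a scikit-learn random forest on a small synthetic feature table (stand-ins for the lexical, speech, disfluency, demographic, and difficulty features) and compares it against a majority-class baseline. All feature names and data are illustrative assumptions, not values from the study.

```python
# Hypothetical sketch: random-forest classification of decision style
# (intuitive vs. analytical) from mixed features, compared to a baseline.
# Feature columns and labels are synthetic placeholders, not study data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.poisson(5, n),        # e.g., count of hedging words (lexical)
    rng.normal(120, 20, n),   # e.g., speech rate in words/min (speech)
    rng.poisson(2, n),        # e.g., filled-pause count (disfluency)
    rng.integers(1, 30, n),   # e.g., years of experience (demographic)
    rng.integers(1, 4, n),    # e.g., case difficulty rating
])
y = rng.integers(0, 2, n)     # 0 = intuitive, 1 = analytical

baseline = cross_val_score(DummyClassifier(strategy="most_frequent"), X, y, cv=5).mean()
forest = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0), X, y, cv=5).mean()
print(f"baseline accuracy: {baseline:.2f}, random forest accuracy: {forest:.2f}")
```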
Proceedings of BioNLP 2014, 2014
The dual process model posits two types of decision-making, which may be ordered on a continuum from intuitive to analytical. This work uses a dataset of narrated image-based clinical reasoning, collected from physicians as they diagnosed dermatological cases presented as images. Two annotators with training in cognitive psychology assigned each narrative a rating on a four-point decision scale, from intuitive to analytical. This work discusses the annotation study, and makes contributions to resource creation methodology and analysis in the clinical domain.
Comparative Cognition & Behavior Reviews, 2008
pp. 46-65. Echoic Object Recognition by the Bottlenose Dolphin. Heidi E. Harley, New College of Florida and The Seas®, Epcot, Walt Disney World® Resort; Caroline M. DeLong, University of Hawaii. Object recognition, essential to ...
Journal of Comparative Physiology
Big brown bats (Eptesicus fuscus) use biosonar to find insect prey in open areas, but they also find prey near vegetation and even fly through vegetation when in transit from roosts to feeding sites. To evaluate their reactions to dense, distributed clutter, bats were tested in an obstacle array consisting of rows of vertically hanging chains. Chains were removed from the array to create a curved corridor at three clutter densities (high, medium, low). Bats flew along this path to receive a food reward after landing on the far wall. Interpulse intervals (IPIs) varied across clutter densities to reflect different compromises between using short IPIs for gathering echoes rapidly enough to maneuver past the nearest chains and using longer IPIs so that all echoes from one sound can be received before the next sound is emitted. In the high clutter density, IPIs were uniformly shorter (20-65 ms) than in the medium and low densities (40-100 ms) and arranged in "strobe groups," with some ...
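As a rough check on the trade-off described above (a back-of-the-envelope calculation added here, not part of the abstract, assuming a sound speed of about 343 m/s in air), the longest range from which all echoes return before the next emission is c·IPI/2:

```latex
\[
r_{\max} = \frac{c \cdot \mathrm{IPI}}{2}, \qquad
r_{\max}(20\ \mathrm{ms}) \approx \frac{343 \times 0.020}{2} \approx 3.4\ \mathrm{m}, \qquad
r_{\max}(100\ \mathrm{ms}) \approx 17\ \mathrm{m}.
\]
```

Shorter IPIs in dense clutter thus keep the bat's acoustic "view" confined to nearby obstacles.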
Animal cognition, 2014
Object constancy, the ability to recognize objects despite changes in orientation, has not been well studied in the auditory modality. Dolphins use echolocation for object recognition, and objects ensonified by dolphins produce echoes that can vary significantly as a function of orientation. In this experiment, human listeners had to classify echoes from objects varying in material, shape, and size that were ensonified with dolphin signals. Participants were trained to discriminate among the objects using an 18-echo stimulus from a 10° range of aspect angles, then tested with novel aspect angles across a 60° range. Participants were typically successful recognizing the objects at all angles (M = 78%). Artificial neural networks were trained and tested with the same stimuli with the purpose of identifying acoustic cues that enable object recognition. A multilayer perceptron performed similarly to the humans and revealed that recognition was enabled by both the amplitude and frequency ...
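As an illustrative sketch of the modeling step described above (not the study's actual network or data), a multilayer perceptron can be trained on echo features drawn from one narrow aspect-angle range and then scored on echoes from novel angles; the arrays below are placeholders.

```python
# Hypothetical sketch: train an MLP on echo features from a narrow aspect-angle
# range, then test generalization to echoes from novel angles. Data are synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n_features = 64                                   # e.g., spectral/amplitude bins per echo
X_train = rng.normal(size=(300, n_features))      # echoes from the trained 10-degree range
y_train = rng.integers(0, 3, 300)                 # three objects
X_test = rng.normal(size=(120, n_features))       # echoes from novel aspect angles
y_test = rng.integers(0, 3, 120)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=1))
clf.fit(X_train, y_train)
print("accuracy on novel aspect angles:", clf.score(X_test, y_test))
```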
The Journal of the Acoustical Society of America, 2006
The focus of this study was to investigate how dolphins use acoustic features in returning echolocation signals to discriminate among objects. An echolocating dolphin performed a match-to-sample task with objects that varied in size, shape, material, and texture. After the task was completed, the features of the object echoes were measured (e.g., target strength, peak frequency). The dolphin's error patterns were examined in conjunction with the between-object variation in acoustic features to identify the acoustic features that the dolphin used to discriminate among the objects. The present study explored two hypotheses regarding the way dolphins use acoustic information in echoes: (1) use of a single feature, or (2) use of a linear combination of multiple features. The results suggested that dolphins do not use a single feature across all object sets or a linear combination of six echo features. Five features appeared to be important to the dolphin on four or more sets: the echo spectrum shape, the pattern of changes in target strength and number of highlights as a function of object orientation, and peak and center frequency. These data suggest that dolphins use multiple features and integrate information across echoes from a range of object orientations.
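One way to contrast the two hypotheses in code (purely illustrative; not the analysis used in the study) is to compare a classifier given a single echo feature with one given a linear combination of several features; all data below are synthetic stand-ins.

```python
# Illustrative contrast of hypothesis (1), a single echo feature, with
# hypothesis (2), a linear combination of features. Data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 400
# Columns stand in for target strength, peak frequency, center frequency,
# duration, number of highlights, and a spectrum-shape score.
X = rng.normal(size=(n, 6))
y = rng.integers(0, 2, n)   # which of two objects produced the echo

single = cross_val_score(LogisticRegression(max_iter=1000), X[:, [0]], y, cv=5).mean()
combo = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(f"single-feature accuracy: {single:.2f}, linear-combination accuracy: {combo:.2f}")
```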
The Journal of the Acoustical Society of America, 2004
Echolocating dolphins extract object feature information from the acoustic parameters of object echoes. However, little is known about which object features are salient to dolphins or how they extract those features. To gain insight into how dolphins might be extracting feature information, human listeners were presented with echoes from objects used in a dolphin echoic-visual cross-modal matching task. Human participants performed a task similar to the one the dolphin had performed; however, echoic samples consisting of 23-echo trains were presented via headphones. The participants listened to the echoic sample and then visually selected the correct object from among three alternatives. The participants performed as well as or better than the dolphin (M = 88.0% correct), and reported using a combination of acoustic cues to extract object features (e.g., loudness, pitch, timbre). Participants frequently reported using the pattern of aural changes in the echoes across the echo train to identify the shape and structure of the objects (e.g., peaks in loudness or pitch). It is likely that dolphins also attend to the pattern of changes across echoes as objects are echolocated from different angles.
The Journal of the Acoustical Society of America, 2008
Big brown bats were trained in a two-choice task to locate a two-cylinder dipole object with a constant 5 cm spacing in the presence of either a one-cylinder monopole or another two-cylinder dipole with a shorter spacing. For the dipole versus monopole task, the objects were either stationary or in motion during each trial. The dipole and monopole objects varied from trial to trial in left-right position while also roving in range (10-40 cm), cross-range separation (15-40 cm), and dipole aspect angle (0-90 degrees). These manipulations prevented any single feature of the acoustic stimuli from being a stable indicator of which object was the correct choice. After accounting for effects of masking between echoes from pairs of cylinders at similar distances, the bats discriminated the 5 cm dipole from both the monopole and dipole alternatives with performance independent of aspect angle, implying a distal, spatial object representation rather than a proximal, acoustic object representation.
The Journal of the Acoustical Society of America, 2006
Insectivorous big brown bats (Eptesicus fuscus) use frequency-modulated ultrasonic echolocation calls to locate and capture prey, often while navigating through highly cluttered areas of vegetation. To test how their calls change while flying through different clutter densities, a matrix of vertically hanging chain links was constructed in a 4.5-m-wide, 10.5-m-long, and 2.6-m-high flight room. Three different clutter densities (low, medium, and high) were created by varying the number of chains in the matrix (9, 114, and 150, respectively). Four wild-born bats were trained to fly through curved gaps in the chain network. These flights were recorded in the dark with a stereoscopic pair of thermal imaging cameras and a heterodyne bat detector. The bats were flown first through the high-, then the low-, and finally the medium-density configuration over a period of 40 days. Preliminary analysis of the pulse intervals of the bats' sounds during flight reveals that the interpulse intervals shorten considerably under high-clutter conditions as compared to medium or low clutter. The data suggest that bats flying through high clutter become limited to shorter interpulse intervals in order to ensonify their immediate environment at the expense of larger-scale navigation. [Work supported by NIH.]
The Journal of the Acoustical Society of America, 2013
Dolphins naturally recognize objects from multiple angles using echolocation. With training, humans can also learn to accurately classify objects based on their echoic features. In this study, we used neural networks to identify acoustic cues that enable objects to be recognized from varying aspects. In simulation 1, a self-organizing map was able to differentiate a subset of objects using only amplitude and frequency cues, but it classified some echoes from different objects as being from the same object. In simulation 2, a multilayer perceptron was trained through error correction to identify objects based on echoes from a single aspect, and then tested on its ability to recognize those objects using echoes from different orientations. Overall, perceptrons performed similarly to trained undergraduates. Analysis of network connection weights revealed that both the amplitude and frequency of echoes, as well as the temporal dynamics of these features over the course of an echo train, enabled perceptrons to accurately identify objects when presented with novel orientations. These findings suggest that learning may strongly impact an organism's ability to echoically recognize an object from any viewpoint.
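A hypothetical sketch of the connection-weight analysis mentioned above (not the study's code): after fitting a small perceptron, the magnitudes of its input-to-hidden weights can be aggregated to see which input features carry the most influence.

```python
# Hypothetical sketch: rank input features by the summed magnitude of an MLP's
# input-to-hidden connection weights. Features and labels are placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 20))   # placeholder echo features (amplitude/frequency bins)
y = rng.integers(0, 4, 300)      # placeholder object labels

mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=3)
mlp.fit(X, y)

# coefs_[0] has shape (n_features, n_hidden); sum |weights| per input feature.
influence = np.abs(mlp.coefs_[0]).sum(axis=1)
top_features = np.argsort(influence)[::-1][:5]
print("most influential input features (by weight magnitude):", top_features)
```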
The Journal of the Acoustical Society of America, 2013
Object constancy, the ability to recognize objects despite changes in orientation, has not been well studied in the auditory modality. Dolphins use echolocation for object recognition, and objects ensonified by dolphins produce echoes that can vary significantly as a function of orientation. In four experiments, human listeners had to classify echoes from objects ensonified with dolphin signals. Participants were trained to discriminate among the objects using an 18-echo stimulus from a 10-degree range of aspect angles, then tested with novel aspect angles across a 60-degree range. In the first two experiments, the three objects varied in material, size, and shape. Participants were typically successful recognizing the objects at all angles (M = 78%). In experiment 3, the three objects had the same material but different shapes. Participants were often unsuccessful recognizing the objects at all angles (M = 46%). In experiment 4, participants had to classify echoes from four fish species across a wider range of angles (330 degrees). Preliminary results show overall poor performance (M = 45%). These results suggest that object characteristics play a role in whether performance is more view-dependent or view-invariant. These studies can provide insight into the process dolphins use to identify objects.
The Journal of the Acoustical Society of America, 2008
Echolocating big brown bats (Eptesicus fuscus) frequently catch insects during aerial pursuits in open spaces, but they also capture prey swarming on vegetation and from substrates. To evaluate perception of targets on cluttered surfaces, big brown bats were trained in a two-alternative forced-choice task to locate a target, varying in height, that was embedded partway in holes (clutter) cut in a foam surface. The holes were colocalized with the possible positions of the target at distances ranging from 25 to 35 cm. For successful perception of the target, the bat had to detect the echoes contributed by the target in the same time window that contained echoes from the clutter. Performance was assessed in terms of target reflective strength relative to clutter strength in the same time window. The bats detected the target whenever its target strength exceeded the clutter by more than 1-2 dB.
The Journal of the Acoustical Society of America, 2007
This study documents the changes in peak frequency, source level, and spectrum shape of echolocation clicks made by the same dolphin performing the same discrimination task in 1998 and in 2003/2004 with spherical solid stainless steel and brass targets. The total average peak frequency used in 1998 was 138 kHz, but in 2003/2004 it had shifted down by a factor of nearly 3.5, to 40 kHz. The total average source level also shifted down, from 206 dB in 1998 to 187 dB in 2003/2004. The standard deviation of these parameter values within time periods was small, indicating a consistent difference between time periods. The average parameter values for clicks used when exposed to brass versus steel targets were very similar, indicating that target type did not greatly influence the dolphin's average echolocation behavior. The spectrum shapes of the average clicks used in 1998 and in 2003/2004 were nearly mirror images of each other, with the peak energy in 2003/2004 being concentrated where the 1998 clicks had the lowest energy content and vice versa. Despite the dramatic differences in click frequency content, the dolphin was able to perform the same discrimination task at nearly the same level of success.
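For reference, a worked conversion of the reported peak-frequency shift into octaves (a check added here, not part of the original abstract):

```latex
\[
\text{shift in octaves} = \log_2\!\left(\frac{138\ \text{kHz}}{40\ \text{kHz}}\right)
= \log_2(3.45) \approx 1.8
\]
```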
The Journal of the Acoustical Society of America, 2007
Echolocating dolphins extract object feature information from the acoustic parameters of echoes. To gain insight into which acoustic parameters are important for object discrimination, human listeners were presented with echoes from objects used in two discrimination tasks performed by dolphins: hollow cylinders with varying wall thicknesses (+/-0.2, 0.3, 0.4, and 0.8 mm), and spheres made of different materials (steel, aluminum, brass, nylon, and glass). The human listeners performed as well as or better than the dolphins at discriminating between the standard object and the comparison objects for both the cylinders (humans = 97.1%; dolphin = 82.3%) and the spheres (humans = 86.6%; dolphin = 88.7%). The human listeners reported using primarily pitch and duration to discriminate among the cylinders, and pitch and timbre to discriminate among the spheres. Dolphins may use some of the same echo features as the humans to discriminate among objects varying in material or structure. Human listening studies can be used to quickly identify salient combinations of echo features that permit object discrimination, which can then be used to generate hypotheses that can be tested using dolphins as subjects.
The Journal of the Acoustical Society of America, 2011
Previous research has shown that musical training affects the type of cues people use to discriminate between auditory stimuli. The current study investigated whether quantity of musical training and musical area of expertise (voice, percussion instrument, non-percussion instrument) affected musical feature perception. Participants with 0-4 yr of experience (13 non-musicians), 5-7 yr of experience (13 intermediate musicians), and 8 yr or more of experience (13 advanced musicians) were presented with pairs of 2.5 s novel music sequences that were identical (no-change trials), differed by one musical feature (pitch, timbre, or rhythm change), or differed by two musical features (pitch and timbre, pitch and rhythm, or timbre and rhythm change). In 64 trials, participants had to report whether they heard a change, as well as classify the specific type of change. Participants in the advanced group (M = 91.2%) and intermediate group (M = 85.0%) performed significantly better than non-musicians (M = 70.0%). There was no effect of area of musical expertise (voice or instrument) on musical feature change detection. These results suggest that musical training in any area increases the ability to perceive changes in pitch, timbre, and rhythm across unfamiliar auditory sequences.
The Journal of the Acoustical Society of America, 2000
Animal behavior experiments require not only stimulus control of the animal's behavior, but also precise control of the stimulus itself. In discrimination experiments with real target presentation, the complex interdependence between the physical dimensions and the backscattering process of an object makes it difficult to extract and control relevant echo parameters separately. In other phantom-echo experiments, the echoes were relatively simple and could only simulate certain properties of targets. The echo-simulation method utilized in this paper can be used to transform any animal echolocation sound into phantom echoes of high fidelity and complexity. The developed phantom-echo system is implemented on a digital signal-processing board and gives an experimenter fully programmable control over the echo-generating process and the echo structure itself. In this experiment, the capability of a dolphin to discriminate between acoustically simulated phantom replicas of targets and their real equivalents was tested. Phantom replicas were presented in a probe technique during a materials discrimination experiment. The animal accepted the phantom echoes and classified them in the same manner as it classified real targets.
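A minimal, hypothetical sketch of phantom-echo synthesis in the spirit of the system described (the actual system runs on a digital signal-processing board; everything below, including the sample rate and impulse response, is an illustrative assumption): a recorded click is convolved with a synthetic target impulse response, attenuated, and delayed by the simulated two-way travel time.

```python
# Illustrative phantom-echo synthesis: filter a recorded echolocation click
# through a synthetic target impulse response, then attenuate and delay it.
# A simplified sketch, not the DSP-board implementation described in the paper.
import numpy as np

fs = 500_000                      # sample rate in Hz (assumed)
t = np.arange(0, 0.0001, 1 / fs)  # 100-microsecond click
click = np.sin(2 * np.pi * 120_000 * t) * np.hanning(t.size)  # toy dolphin-like click

# Synthetic target impulse response: two reflective highlights (e.g., front and
# back surface of a hollow cylinder), the second delayed and weaker.
impulse_response = np.zeros(200)
impulse_response[0] = 1.0
impulse_response[60] = 0.4

range_delay_s = 0.002             # simulated two-way travel time to the target
attenuation = 0.05                # overall echo level relative to the outgoing click

echo = attenuation * np.convolve(click, impulse_response)
phantom = np.concatenate([np.zeros(int(range_delay_s * fs)), echo])
print("phantom echo length (samples):", phantom.size)
```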