<!DOCTYPE HTML>
<html lang="en"><head>
<meta name="google-site-verification" content="mzyi5h_x1DvmnvvOynztP5h3mA2mdBpUjfhySMB9NNI" />
<!-- Google Tag Manager -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-NS48J6H');</script>
<!-- End Google Tag Manager -->
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<title>Lorna Mugambi</title>
<meta name="author" content="Lorna Mugambi">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="stylesheet" type="text/css" href="style.css">
<link rel="stylesheet" type="text/css" href="stylesheet.css">
<link rel="stylesheet" href="https://cdn.jsdelivr.net/gh/jpswalsh/academicons@1/css/academicons.min.css">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/6.4.0/css/all.min.css">
<link rel="icon" href="assets/imgs/logo3.png">
</head>
<body>
<table style="width:100%;max-width:1200px;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;"><tbody>
<tr style="padding:0px">
<td style="padding:0px">
<table style="width:100%;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;"><tbody>
<tr style="padding:0px">
<td style="padding:2.5%;width:63%;vertical-align:middle">
<p style="text-align:center">
<name>Lorna Mugambi</name>
</p>
<p>I am a Research Fellow at the <a href="https://dekut-dsail.github.io/">Centre for Data Science and Artificial Intelligence (DSAIL)</a>, Nyeri, Kenya. My research interests include self-supervised learning, computer vision, and medical image analysis.
</p>
<p style="text-align:center">
 |  <a href="mailto:lornamugambik@gmail.com"><img src="assets/envelope-solid.svg" alt="Icon" class="icon">Email</a>  |  <br>
 |  <a href="assets/docs/Lorna-resume.pdf"><img src="assets/file-solid.svg" alt="Icon" class="icon">CV</a>  | 
<a href="https://github.com/lornamugambi"><img src="assets/github.svg" alt="Icon" class="icon">Github</a>  |  <br>
 |  <a href="https://scholar.google.com/citations?user=whx9i5gAAAAJ&hl=en"><img src="assets/google-scholar.svg" alt="Icon" class="icon">Google Scholar</a>  | 
<a href="https://orcid.org/0000-0002-0824-5016"><img src="assets/orcid.svg" alt="Icon" class="icon">Orcid</a>  |  <br>
 |  <a href="https://www.linkedin.com/in/lorna-mugambi/"><img src="assets/LinkedIn_icon.webp" alt="Icon" class="icon">LinkedIn</a>  |  <br>
</p>
<table style="width:100%;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;">
<tbody>
<tr>
<td style="padding:2.5%;height:100px;vertical-align:middle;text-align:center">
<a href="https://dekut-dsail.github.io/">
<img style="height:60px; max-width:50%; width:auto;" alt="dsail logo" src="assets/imgs/dsail_black.png" class="hoverZoomLink">
</a>
</td>
</tr>
</tbody>
</table>
</td>
<td style="padding:2.5%;width:40%;max-width:40%">
<a href="#">
<img style="width:100%;max-width:100%" alt="profile photo" src="assets/imgs/DSC_6312.jpg" class="hoverZoomLink">
</a>
</td>
</tr>
</tbody></table>
<table style="width:100%;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;">
<tbody>
<tr>
<td style="padding:20px;width:100%;vertical-align:middle">
<heading>Bio</heading>
<p>
I am pursuing my MSc in Telecommunication Engineering at Dedan Kimathi University of Technology, where I am advised by <a href="http://ciirawamaina.com/">Prof. Ciira wa Maina</a> and <a href="https://health.uct.ac.za/cape-heart-institute/contacts/liesl-zuhlke">Prof. Liesl Zühlke</a>.
</p>
</td>
</tr>
</tbody>
</table>
<table style="width:100%;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;">
<tbody>
<tr>
<td style="padding:20px;width:100%;vertical-align:middle">
<heading>
<font size="5">Recent Updates</font>
</heading>
<ul style="list-style-type:none;padding:0;">
<li>
<div style="margin: 0 20px;">
<em style="font-size:14px;"><b>[July-Aug 2024:]</b> I was invited to be a speaker at the ecology workshop of <a href="https://indabaxug.github.io/index.html"><strong>Deep Learning IndabaX Uganda 2024</strong></a> in Kampala, Uganda. I talked about how we can leverage the use of AI in conservation preservation.</em>
<div style="text-align:center; margin-top:10px;">
<img src="assets/imgs/dlix.jpg" alt="Image 1" style="width:200px;height:150px;">
</div>
</div>
</li>
<li>
<div style="margin: 0 20px;">
<em style="font-size:14px;"><b>[July 2024:]</b> I attended the inaugural <a href="https://www.acvss.ai/home"><strong>African Computer Vision Summer School</strong></a> in Nairobi, Kenya. The summer school entailed an intense 10 days where we had lectures and practicals on dataset construction, advanced architectures for vision, visual representation learning, generative modelling, video understanding, shift, domain adaptation and ethics-ecology. There was also a hackathon where we got to apply lessons we were learning from the summer school and my team won the Research Impact track!</em>
<div style="text-align:center; margin-top:10px;">
<img src="assets/imgs/PXL_20240723_113955730.jpg" alt="Image 1" style="width:200px;height:150px;">
</div>
</div>
</li>
<li>
<div style="margin: 0 20px;">
<em style="font-size:14px;"><b>[June 2024:]</b> I helped organize, prepare material and host <a href="https://www.datascienceafrica.org/dsa2024nyeri/"><strong>Data Science Africa 2024</strong></a> in Nyeri, Kenya. It was a pleasure to work with Dr. Adnrew Katumba on the Computer Vision Session where we worked on the use of Roboflow for data annotation and the use of YOLO v8 to classify, detect and segment images from the conservation and health domains. It was also a pleasure to assist Prof. Justin Dauwels with the Generative AI session.</em>
<div style="text-align:center; margin-top:10px;">
<img src="assets/imgs/dsa2024-cv.jpg" alt="Image 2" style="width:200px;height:150px;">
<img src="assets/imgs/dsa2024-gen.jpg" alt="Image 2" style="width:200px;height:150px;">
</div>
</div>
</li>
</ul>
</td>
</tr>
</tbody>
</table>
<table style="width:100%;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;"><tbody>
<tr>
<td style="padding:20px;width:100%;vertical-align:middle">
<heading>Publications</heading>
</td>
</tr>
</tbody></table>
<table style="width:100%;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;"><tbody>
<tr>
<td style="padding:20px;width:100%;vertical-align:middle">
<heading>2023</heading>
</td>
</tr>
</tbody></table>
<table style="width:100%;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;"><tbody>
<tr>
<td style="padding:20px;width:25%;vertical-align:middle">
<div class="one">
<img src='assets/imgs/gejusta.png' width="200" height="120">
</div>
</td>
<td style="padding:20px;width:75%;vertical-align:middle">
<a href="https://repository.dkut.ac.ke:8080/xmlui/handle/123456789/8422">
<papertitle>Analysis of women representation in STEM in Africa</papertitle>
</a>
<br>
<a href="https://kiariegabriel.github.io/">Gabriel Kiarie</a>,
<strong>Lorna Mugambi</strong>,
<a href="https://kabi23.github.io/">Jason Kabi</a>,
<a href="http://ciirawamaina.com/">Ciira wa Maina</a> <br>
<em>7th DeKUT International Conference on Science, Technology, Innovation and Entrepreneurship</em>, November, 2023.
<br>
<a href="https://repository.dkut.ac.ke:8080/xmlui/handle/123456789/8422"><span><i class="fas fa-scroll"></i></span>Paper</a>
<p></p>
<p>Girls and women have consistently been underrepresented in most Science, Technology, Engineering, and Mathematics (STEM) professions, necessitating research [1]. There is a need to define and execute measures and policies to help reduce this gap [2]. The Centre for Data Science and Artificial Intelligence (DSAIL), in collaboration with Gender Justice in STEM Research in Africa (GeJuSTA), is conducting studies to analyse the representation of women in STEM in Africa. The study will be used to guide the development of policies and curricula aimed at bridging the gap in women's representation in STEM. The methods used in this study are: analysing the genders of members of staff in STEM faculties at African universities; analysing the genders of authors of STEM papers from African universities; and conducting a literature review to evaluate existing measures that have been put in place to encourage and enable women to join STEM professions. Preliminary results show that women are underrepresented in STEM fields in Africa.
</p>
</td>
</tr>
<tr>
<td style="padding:20px;width:25%;vertical-align:middle">
<div class="one">
<img src='assets/imgs/dsailboard.jpeg' width="180">
</div>
</td>
<td style="padding:20px;width:75%;vertical-align:middle">
<a href="https://ieeexplore.ieee.org/abstract/document/10293682">
<papertitle>The use of Open-Source Boards for Data Collection and Machine Learning in Remote Deployments</papertitle>
</a>
<br>
<a href="https://kiariegabriel.github.io/">Gabriel Kiarie</a>,
<a href="https://kabi23.github.io/">Jason Kabi</a>,
<strong>Lorna Mugambi</strong>,
<a href="http://ciirawamaina.com/">Ciira wa Maina</a> <br>
<em>2023 IEEE AFRICON</em>, September.
<br>
<a href="https://doi.org/10.1109/AFRICON55910.2023.10293682"><span><i class="fas fa-scroll"></i></span>Paper</a>
<p></p>
<p>Machine learning is being adopted in many walks of life to solve various problems, driven by the development of robust machine learning algorithms, the availability of large datasets, and low-cost computational resources. Some machine learning applications require deploying devices off-grid for data collection and processing; such applications require systems that can operate autonomously during their deployment. This paper presents how some open-source boards have been leveraged for off-grid data collection and machine learning. Advances in technology have seen the development of low-cost, low-power open-source boards that can be interfaced with a wide array of sensors for data collection and can perform computation, and these boards are finding wide application in data collection and machine learning initiatives. A wide array of open-source boards exists in the market; they can generally be divided into microcontrollers, single-board computers, and field-programmable gate arrays, with differing processing capabilities, power consumption, communication interfaces, and features. For off-grid data collection and machine learning tasks, resources such as power and network connectivity are limited in most cases. These factors should be considered when choosing boards for off-grid deployments: the boards chosen should optimise the use of these resources while meeting the processing capabilities required for the tasks at hand.
</p>
</td>
</tr>
<tr>
<td style="padding:20px;width:25%;vertical-align:middle">
<div class="one">
<img src='assets/imgs/annotated.gif' width="200" height="100">
</div>
</td>
<td style="padding:20px;width:75%;vertical-align:middle">
<a href="https://ieeexplore.ieee.org/abstract/document/10293724">
<papertitle>Efficient Camera Trap Image Annotation Using YOLOv5</papertitle>
</a>
<br>
<a href="#">Yuri Njathi</a>,
<a href="#">Lians Wanjiku</a>,
<strong>Lorna Mugambi</strong>,
<a href="https://kabi23.github.io/">Jason Kabi</a>,
<a href="https://kiariegabriel.github.io/">Gabriel Kiarie</a>,
<a href="http://ciirawamaina.com/">Ciira wa Maina</a> <br>
<em>2023 IEEE AFRICON</em>, September.
<br>
<a href="https://doi.org/10.1109/AFRICON55910.2023.10293724"><span><i class="fas fa-scroll"></i></span>Paper</a>
<p></p>
<p>Using camera traps to acquire wildlife images is becoming more common within conservancies. The information provided by these camera traps enhances understanding of wildlife behaviour and population patterns. The detection and counting of animals present in each of the captured images is valuable information, as it can be used to guide conservation efforts. Manual annotation of these wildlife images is a tedious, painful process, so it is becoming more common to use tools that either use AI to annotate camera trap datasets or use AI to aid in annotation. These AI tools are usually trained on species endemic to a particular region, and the ability to fine-tune such models to species endemic to one's own region is important to save much of the time conservationists spend manually looking through misclassified images. In this paper, we present a case study in which we used a YOLOv5 object detection model trained to detect the presence and count the number of impala and other animals in a dataset collected by researchers at the Dedan Kimathi University of Technology Conservancy. We analyse the AI's performance with respect to a manually annotated dataset. The model was able to annotate 72% of the dataset at a human level of accuracy. The work here shows promise with regard to the time spent labelling camera trap images, by leveraging the presence of particular species to auto-annotate a majority of the dataset.
</p>
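<p>As a rough illustration of the auto-annotation idea (not the exact pipeline from the paper), the sketch below loads a fine-tuned YOLOv5 model via torch.hub and writes YOLO-format label files for a folder of camera-trap images; the weights file and directory names are assumptions.</p>
<pre><code># Hypothetical sketch: auto-annotating camera-trap images with a fine-tuned
# YOLOv5 model. "impala_best.pt" and the folder names are placeholders.
from pathlib import Path
import torch

model = torch.hub.load("ultralytics/yolov5", "custom", path="impala_best.pt")
model.conf = 0.5  # confidence threshold; uncertain images go to manual review

image_dir = Path("camera_trap_images")
label_dir = Path("auto_labels")
label_dir.mkdir(exist_ok=True)

for img_path in sorted(image_dir.glob("*.jpg")):
    results = model(str(img_path))
    # xywhn: normalised [x_center, y_center, width, height, conf, class] rows
    dets = results.xywhn[0]
    lines = [f"{int(c)} {x:.6f} {y:.6f} {w:.6f} {h:.6f}"
             for x, y, w, h, conf, c in dets.tolist()]
    (label_dir / f"{img_path.stem}.txt").write_text("\n".join(lines))
</code></pre>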
</td>
</tr>
<tr>
<td style="padding:20px;width:25%;vertical-align:middle">
<div class="one">
<img src='assets/imgs/njath6-ISTAfrica_Paper_ref_110_xplore-small.gif' width="180" height="auto">
</div>
</td>
<td style="padding:20px;width:75%;vertical-align:middle">
<a href="https://ieeexplore.ieee.org/document/10187738">
<papertitle>Unsupervised Discovery of Echocardiographic Views for Rheumatic Heart Disease Diagnosis</papertitle>
</a>
<br>
<a href="#">Yuri Njathi</a>,
<strong>Lorna Mugambi</strong>,
<a href="https://ciirawamaina.com">Ciira wa Maina</a>,
<a href="https://health.uct.ac.za/cape-heart-institute/contacts/liesl-zuhlke">Liesl Zühlke</a> <br>
<em>2023 IST-Africa Conference (IST-Africa)</em>, May.
<br>
<a href="https://doi.org/10.23919/IST-Africa60249.2023.10187738"><span><i class="fas fa-scroll"></i></span>Paper</a>
<p></p>
<p>Rheumatic heart disease (RHD) is a cardiovascular disease that causes damage to the heart valves. If the damage is severe, it is rectified using expensive valve replacement surgery, whereas early diagnosis of the disease allows for cost-friendly preventive measures. Specific views of the heart are required for proper assessment by heart specialists. Since routine screening is recommended for the rapid early identification of RHD, a large amount of patient data is generated. To handle this influx of data, trained AI is being used to automate view classification; unfortunately, the high cost of obtaining expert-labelled data, in terms of both time and money, is prohibitive. We therefore explore how unsupervised AI methods can aid experts in faster labelling of the data, and what patterns PCA and agglomerative clustering identify in echo videos. We found that, after appropriate preprocessing, these unsupervised methods can group videos with similar echocardiographic views. We also found that these methods were sensitive to the specific machines used to acquire the data, and therefore care should be taken when applying them to data collected using different machines.
</p>
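<p>A minimal sketch of the kind of grouping described above, assuming one representative pre-processed frame per echo video; this illustrates PCA followed by agglomerative clustering with scikit-learn, not the authors' exact pipeline. The input file and cluster count are assumptions.</p>
<pre><code># Minimal sketch: group echo videos by view using PCA + agglomerative clustering.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import AgglomerativeClustering

# frames: (n_videos, height, width) array of representative greyscale frames
frames = np.load("echo_frames.npy")                  # hypothetical pre-processed file
X = frames.reshape(len(frames), -1).astype(np.float32)
X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)    # per-pixel standardisation

X_pca = PCA(n_components=50).fit_transform(X)        # compact representation
labels = AgglomerativeClustering(n_clusters=4).fit_predict(X_pca)

for cluster in np.unique(labels):
    print(f"cluster {cluster}: {np.sum(labels == cluster)} videos")
</code></pre>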
</td>
</tr>
<tr>
<td style="padding:20px;width:25%;vertical-align:middle">
<div class="one">
<img src='assets/imgs/dsailporinipaper.jpg' width="200" height="100">
</div>
</td>
<td style="padding:20px;width:75%;vertical-align:middle">
<a href="https://doi.org/10.1016/j.dib.2022.108863">
<papertitle>DSAIL-Porini: Annotated camera trap image data of wildlife species from a conservancy in Kenya</papertitle>
</a>
<br>
<strong>Lorna Mugambi</strong>,
<a href="https://kabi23.github.io/">Jason Kabi</a>,
<a href="https://kiariegabriel.github.io/">Gabriel Kiarie</a>,
<a href="http://ciirawamaina.com/">Ciira wa Maina</a> <br>
<em>Data in Brief</em>, February.
<br>
<a href="https://doi.org/10.1016/j.dib.2022.108863"><span><i class="fas fa-scroll"></i></span>Paper</a>
<p></p>
<p>For years, zoologists, ecologists, and researchers at large have been using instruments such as camera traps to acquire images of wild animals non-intrusively for ecological research. The main reason behind ecological research is to increase the understanding of various interactions in ecosystems while providing supporting data and information. Due to climate change and the destruction of animal habitats in recent years, researchers have been conducting studies on diminishing populations of various species of interest and the effectiveness of habitat restoration practices. By collecting and examining wild animal image data, inferences such as the health, breeding rate, and population of a particular species can be made. This paper presents an annotated camera trap dataset, DSAIL-Porini, consisting of images of wildlife species captured in a conservancy in Nyeri, Kenya. Six wildlife species are captured in this dataset: impala, bushbuck, Sykes’ monkey, defassa waterbuck, common warthog, and Burchell's zebra. The dataset was collected using camera traps based on the Raspberry Pi 2, Raspberry Pi Zero, and OpenMV Cam H7, and provides an example of images collected using relatively low-cost hardware platforms. The image dataset can be used in training and testing object detection and classification machine learning models.
</p>
</td>
</tr>
</tbody></table>
<table style="width:100%;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;"><tbody>
<tr>
<td style="padding:20px;width:100%;vertical-align:middle">
<heading>2022</heading>
</td>
</tr>
</tbody></table>
<table style="width:100%;border:0px;border-spacing:0px;border-collapse:separate;margin-right:auto;margin-left:auto;"><tbody>
<tr>
<td style="padding:20px;width:25%;vertical-align:middle">
<div class="one">
<img src='assets/imgs/rhd-pipeline.gif' width="200" height="100">
</div>
</td>
<td style="padding:20px;width:75%;vertical-align:middle">
<a href="https://ieeexplore.ieee.org/abstract/document/9845657">
<papertitle>Towards AI Based Diagnosis of Rheumatic Heart Disease: Data Annotation and View Classification</papertitle>
</a>
<br>
<strong>Lorna Mugambi</strong>,
<a href="https://ciirawamaina.com">Ciira wa Maina</a>,
<a href="https://health.uct.ac.za/cape-heart-institute/contacts/liesl-zuhlke">Liesl Zühlke</a> <br>
<em>2022 IST-Africa Conference (IST-Africa)</em>, May.
<br>
<a href="https://doi.org/10.23919/IST-Africa56635.2022.9845657"><span><i class="fas fa-scroll"></i></span>Paper</a>
<p></p>
<p>Rheumatic Heart Disease is a cardiovascular disease highly prevalent in developing countries partially because of inadequate healthcare infrastructure to treat Group A streptococcus pharyngitis and thereafter diagnose and document every case of Acute Rheumatic Fever, the immune-mediated antecedent of rheumatic heart disease. Secondary antibiotic treatment with penicillin injections after a diagnosis of Acute Rheumatic Fever and Rheumatic Heart Disease is used to prevent further attacks of Strep A, preferably prior to any heart valve damage. Echocardiographic screening for early detection of Rheumatic Heart Disease has been proposed as a method to improve outcomes but it is time-consuming, costly and few people are skilled enough to reach a correct diagnosis. Machine Learning is an emerging tool in analysing medical images; our aim is to automate the screening process of diagnosing rheumatic heart disease. In this paper, we present a web application to be used to label echocardiography data. These labelled data can then be used to develop machine learning models that can classify echocardiographic views of the heart and damaged valves from the echocardiograms.
</p>
</td>
</tr>
<tr>
<td style="padding:20px;width:25%;vertical-align:middle">
<div class="one">
<img src='assets/imgs/dsailporinidataset.jpeg' width="180">
</div>
</td>
<td style="padding:20px;width:75%;vertical-align:middle">
<a href="https://data.mendeley.com/datasets/6mhrhn7rxc/6">
<papertitle>DSAIL-Porini: Annotated camera trap images of wildlife species from a conservancy in Kenya</papertitle>
</a>
<br>
<strong>Lorna Mugambi</strong>,
<a href="https://kiariegabriel.github.io/">Gabriel Kiarie</a>,
<a href="https://kabi23.github.io/">Jason Kabi</a>,
<a href="http://ciirawamaina.com/">Ciira wa Maina</a> <br>
<em>Mendeley Data</em>, March 2022.
<br>
<a href="https://data.mendeley.com/datasets/6mhrhn7rxc/6"><span><i class="fas fa-scroll"></i></span>Dataset</a>
<p></p>
<p>This dataset contains camera trap images of wildlife species from a conservancy in Kenya, together with their annotations. The camera traps are based on the Raspberry Pi 2, Raspberry Pi Zero, and OpenMV Cam H7 devices, and were deployed in the conservancy from June 2021 to December 2021. The dataset covers six categories of grazing mammals: Burchell's zebra, defassa waterbuck, bushbuck, common warthog, impala, and Sykes' monkey.
</p>
</td>
</tr>
</tbody></table>
<hr>
<table width="100%" align="center" border="0" cellspacing="0" cellpadding="20"><tbody>
<tr>
<td>
<heading>Tutorials</heading>
</td>
</tr>
</tbody></table>
<table width="100%" align="center" border="0" cellpadding="20"><tbody>
<tr>
<td style="padding:20px;width:25%;vertical-align:middle"><img src="assets/imgs/DSC_0456.JPG" width="180"></td>
<td width="75%" valign="center">
<a href="https://dekut-dsail.github.io/tutorials/tutorial-camera-traps/index.html">Hands-On Tutorial Session, DSAIL-Tech4Wildlife Workshop - November, 2023.</a><br>
<p>Developing a Low-Cost Raspberry Pi-based Camera Trap for Wildlife Detection</p>
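<p>For readers curious about the capture loop behind such a camera trap, here is a hypothetical minimal sketch of PIR-triggered image capture on a Raspberry Pi; the GPIO pin and output directory are assumptions, and the full tutorial linked above covers the actual build.</p>
<pre><code># Hypothetical sketch: PIR-triggered capture loop for a Raspberry Pi camera trap.
from datetime import datetime
from pathlib import Path
from gpiozero import MotionSensor
from picamera2 import Picamera2

pir = MotionSensor(4)                 # PIR sensor on GPIO 4 (assumed wiring)
camera = Picamera2()
camera.start()
out_dir = Path("/home/pi/captures")
out_dir.mkdir(exist_ok=True)

while True:
    pir.wait_for_motion()             # block until the PIR sensor fires
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    camera.capture_file(str(out_dir / f"capture_{stamp}.jpg"))
    pir.wait_for_no_motion()          # avoid bursts of near-duplicate frames
</code></pre>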
</td>
</tr>
</tbody></table>
</body>
</html>