Digital Forensics Now

Vendor Transparency, Mobile Device Extractions, & Brigs Learns the Difference Between Validation and Verification

November 16, 2023 Heather Charpentier & Alexis "Brigs" Brignoni Season 1 Episode 6


We are back with a mind-boggling conversation about our experiences and the ever-evolving face of digital forensics. We're going to share some personal anecdotes, enlighten you about the changing UNIX epoch timestamp, and even discuss how we cope with advancing age in this fast-paced world.

In the digital world, knowledge is power. We will reveal an amazing cheat sheet from Cellebrite that will simplify your understanding of extractions and the data they yield. We'll also delve into the concept of tool transparency, highlighting the pros and cons that come with it. We'll help you understand why it's crucial to be informed about known bugs in a tool, and how to navigate the complex process of bug reporting. We're going to discuss why it's essential to have multiple tools in your arsenal for data validation, and why manual validation is a must when it comes to key evidence.

As we wrap up, we'll talk about the implementation of ALEAPP and iLEAPP in Paraben and its capabilities to choose artifacts to report on. To add some levity, we'll also share a humorous meme that perfectly captures the essence of the repercussions of failing to validate your digital data. So, prepare to embark on a journey that’s bound to make you rethink everything you know about data extraction and tooling analysis.

Notes-
Scholarship Reminders
-https://www.iacis.com/will-docken-scholarship/

-https://www.iacis.com/womens-scholarship/

-https://www.magnetforensics.com/blog/2023-magnet-forensics-scholarship-program-apply-today/

Cellebrite Data Extraction CheatSheet
-https://www.linkedin.com/posts/heather-mahalik-cellebrite_data-extraction-cheatsheet-activity-7125138491805462528-l5-5/

-https://cellebrite.com/en/episode-23-i-beg-to-dfir-data-extractions-explained-ffs-afu-bfu-advanced-logical-digital-forensics-webinar/

Paraben
-https://paraben.com




Speaker 1:

Hello everybody. Today is Thursday, November 16, 2023. My name is Alexis "Brigs" Brignoni and I'm accompanied, as always, lucky to have my co-host, the proficient checker of tool performance, the one that checked, is this even wrong? The one and only Heather Charpentier. The music is "Higher Up" by Shane Ivers and can be found at silvermansound.com. Heather, so happy to be here. Let me show people our faces. There we go.

Speaker 2:

Oh, thank you for the great introduction.

Speaker 1:

It's all true and you know it. You're humble like that, but it's true. I lowered the volume a little bit abruptly, but it's all good. Hello, everybody that's live. We appreciate you being here, and hello to the good folks that are listening a little bit later through the podcast or watching the stream a little bit later. So thank you for being here. So, as always, Heather, you got to tell me what you've been up to in the last two weeks, or 14 days, that we haven't talked. What's going on? What happened?

Speaker 2:

It has been busy. I've been really busy with work. I have been creating my own timeline for a case I've been working on because I can't find a tool that I like the timeline report in. I know we've talked about that, so I have been creating my own timeline by hand and my plan is hopefully I might know somebody who will help me automate that after I'm done creating it all by hand.

Speaker 1:

That's awesome. Can you refer me to that person?

Speaker 2:

I have some stuff that I want that person to do too. Yeah, I'll see if I can find him and maybe I can introduce you to him.

Speaker 1:

Ah, so it's a guy. Okay, I would appreciate that. I got some things I need to run by this person. It's kind of sad, right, because all the tools say they do a timeline, but then they never do the type of timeline that we need them to do.

Speaker 2:

Doing it by hand, I can create exactly what I want. It just takes a really, really long time to do it by hand.

Speaker 1:

Yeah, yeah, and that's something that we talked about a little bit last week, right, about how reporting looks, and we'll continue to talk about that till you all, vendors, get your game straight. Come on.

Speaker 2:

But I also did start learning Python, so hopefully I'll be able to accompany this person in automating my timeline.
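A first step toward automating that kind of timeline in Python might look something like the sketch below. It merges timestamped artifacts from multiple tool exports into one chronologically sorted list; the CSV column names (`timestamp`, `description`) are assumptions for illustration, not any specific tool's actual export format.

```python
import csv
from datetime import datetime

def load_artifacts(path, source_name):
    """Read one tool's CSV export into (timestamp, source, description) rows.

    Assumes a hypothetical export with ISO 8601 'timestamp' and
    'description' columns -- adjust to your tool's actual headers.
    """
    rows = []
    with open(path, newline="", encoding="utf-8") as f:
        for rec in csv.DictReader(f):
            ts = datetime.fromisoformat(rec["timestamp"])
            rows.append((ts, source_name, rec["description"]))
    return rows

def build_timeline(*artifact_lists):
    """Merge every artifact list into one chronologically sorted timeline."""
    merged = [row for lst in artifact_lists for row in lst]
    merged.sort(key=lambda row: row[0])  # sort on the timestamp column
    return merged
```

From there, writing the merged rows back out to a report format you actually like is one more loop.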

Speaker 1:

Well, look at that. Not quite there yet? Tell that person to maybe do some artifacts for the Leaps, I don't know.

Speaker 1:

Right, right. Oh yeah, we'll get it done at some point, don't worry. Well, at least me, the last couple of weeks I've been doing things not as hard or involved as yours, so I'm going to share with folks so they can see what I've been up to lately. So the first thing I'm going to show you is a nice picture here of a beautiful skyline, a city skyline as well, like the sky and the city, right. I was last week, I mean this week, sorry, in beautiful Bogotá, Colombia, a beautiful city in South America, and I was there to do some work. And you can see there it's quite expansive: eight million people live there, in a city that's way, way, way higher than Denver, in the Andes.

Speaker 1:

So I was a little bit short of breath. I'm a creature of the sea, born on an island and living in Florida, the flattest place on earth, so I had a little bit of a hard time getting used to the atmosphere there, but it was a great experience, actually. Let me show a little bit of another picture of what I was doing. So I was there and I had the honor and the pleasure to give a few conferences and some talks on different digital forensics topics and law enforcement issues to the Colombian National Police, and you can see me there addressing a group of investigators and detectives from different specialized units. So it was super awesome, and obviously the food is great. So if you find yourself there, get some pan de bono and get some good food. You're never gonna regret it.

Speaker 2:

Very nice. What was the topic?

Speaker 1:

So we're doing some work on extortion type of cases and then a little bit on mobile forensics and some of that parsing of data.

Speaker 2:

So very nice.

Speaker 1:

Do you show the usual stuff?

Speaker 2:

Very nice. You also did a little presentation for some students in New York. I was speaking with Alexis and he was telling me about this presentation he's going to be doing at the University at Albany, and I'm thinking, University at Albany, that sounds familiar to me, I used to teach there, at UAlbany in New York. And he's like, yes, and he was doing a presentation for the students, and actually the student who invited him to do the presentation used to be a student of mine in the mobile forensics class. So I found out, and I hijacked the meeting and joined the conversation.

Speaker 1:

It was great to have you, Heather, as always. And the students had such great questions, right, in regards to how do you get into the field, career progression questions. You know, we were discussing pretty much a day in the life of a forensic examiner, so it was pretty interesting.

Speaker 2:

It was a great, great presentation.

Speaker 1:

You're too kind. Check's in the mail. Yeah, and also I had a great time. Thanks for reminding me of that. Yeah, it was really good.

Speaker 1:

I love talking to young people and seeing that interest and that drive that they have, which just tells me that just because I'm older doesn't mean I need to lose it. It always energizes me, you know. Which, talking about getting older and making sure you're energized as you get older, another reason that I know I'm getting older is because now the UNIX epoch timestamp has changed. So usually it starts with one six, and we look for that one six long number, milliseconds or whatever it is, right, there it is. And Alex Caithness, a good friend of the show, reminded us again when we talked about his tool in the last episode, and he made a great post. When you see that 17, whatever, I mean, that's a lot of milliseconds there, you know that it's a recent date, so be on the lookout for that. The timestamp changed. So we geeky folks had some cake to celebrate the change of the epoch timestamp there, and that's where we're at.
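The change Alexis is describing can be shown in a few lines: Unix time crossed 1,700,000,000 seconds on November 14, 2023 (UTC), which is why timestamps now lead with 17 instead of 16. A quick sketch, including the common heuristic of treating 13-digit values as milliseconds:

```python
from datetime import datetime, timezone

def from_epoch(value):
    """Convert a Unix timestamp to a UTC datetime.

    Heuristic: 13-digit values are treated as milliseconds,
    shorter ones as seconds.
    """
    if value >= 1_000_000_000_000:  # 13 digits -> milliseconds
        value /= 1000
    return datetime.fromtimestamp(value, tz=timezone.utc)

# 1_700_000_000 seconds after the epoch fell on November 14, 2023 (UTC),
# the "cake" moment -- any epoch value starting with 17 is after that date.
```

Usage: `from_epoch(1_700_000_000)` returns `2023-11-14 22:13:20+00:00`, and the same value in milliseconds decodes to the same instant.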

Speaker 2:

I sent this around as an email in my office and I may have gotten called a nerd a couple of times, so I guess it's an honor.

Speaker 1:

And you say thank you. Actually, I did and I will, every time they call me one, yeah. So timestamps are changing, time moves forward, it doesn't stop, and there's always good stuff to do and good things going on in the community. So what's going on in the community lately?

Speaker 2:

So a couple of reminders. We talked on a previous podcast about a few scholarships that were available. IACIS has a couple of scholarships and the deadlines are coming up, so I just wanted to give a quick reminder. The Will Docken Scholarship, which is one of the IACIS scholarships, is for an individual employed by a city, county or state law enforcement agency to conduct digital forensics examinations. The agency can't have more than two personnel assigned to conduct digital forensic examinations. Applications are accepted through November 30th, so that deadline is coming up pretty fast.

Speaker 2:

If you want to get your application in, please do so. And IACIS actually has another one, the Women in Law Enforcement Scholarship. That one is open to either sworn or civilian women in law enforcement, and it involves a written essay on career goals and how the applicant will benefit from the BCFE training from IACIS, and that one is also accepting applications through November 30th. So get your applications in. And then the Magnet scholarship is open until December 1st, and that one involves an all-access training pass and a one-year license to Magnet Axiom, and there are two categories awarded: New to Digital Forensics and an Advanced category. So again, get your applications in for those scholarships. All three are great opportunities for people in the digital forensics field.

Speaker 1:

And that's one of the most common questions I had yesterday with the students from the university, and I have it from folks all around. It's like, how do I get in, what type of training, and stuff? And this training is expensive. So that's why we appreciate organizations like IACIS and Magnet for giving out those scholarships. One of the things I tell folks is make sure that you can get your foot in the door by having some of the things that you need to have. But certification-wise, for some of them, the only real way of getting them is either because your workplace sponsors you and pays for it, or if there's a work-study program. And I tell people, look into work-study programs. If you're not lucky enough to get a scholarship, you can facilitate a class, and SANS does that all the time. You facilitate a class and then you get a really discounted rate on the certification class, and even possibly the certification attempt or test. So yeah, look into that. I'm really happy seeing folks in the chat. So hi everybody, good to have you here.

Speaker 2:

I can see a few of my coworkers in there calling me a nerd. Thanks guys.

Speaker 1:

They do it here, they can't do it at work, so they take advantage of the situation. Yeah. So let's get a little bit technical. So we've got our scholarships; let's talk a little bit about a great cheat sheet that Cellebrite came out with. And I love cheat sheets, they're the greatest. Oh yeah, me too. I mean, you can't substitute knowledge with a cheat sheet, because then you'll be missing stuff, but it's a good reminder to kind of direct our attention to stuff that we need to be aware of. So what's the cheat sheet about, Heather?

Speaker 2:

So this cheat sheet has a comparison of the different types of extraction. Cellebrite put this cheat sheet out. It starts off with the different states that a device may be in I'm just going to remove this banner, hold on one second. So the different states that you may find your device in, whether it be in a cold state or a hot state, so BFU or AFU, and it talks about encryption types. But the part I like best about this cheat sheet is it gives you a comparison of what types of data you'll find in the different types of extractions so logical, advanced logical, full file system and it tells you what type of extraction you're going to find calls in, what type of extraction you might find location data in, and some of the more common types of artifacts.

Speaker 2:

Actually, Alexis and I were both kind of doing this before the cheat sheet came out. We were doing presentations, both for work purposes, and we started this. I would have liked this before I started doing the presentation that I was creating for work, but this is an excellent cheat sheet for anybody in the field to have. It will come in really handy when you're trying to explain to somebody who may not understand the difference between the extractions if you're explaining it, in my case, for court purposes, maybe to a district attorney the differences in what the different types of extractions may get you, and why you want to go for that full file system extraction.
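The idea behind the cheat sheet can be sketched as a simple lookup. To be clear, the artifact sets below are illustrative examples only, not Cellebrite's actual matrix: what each extraction type yields varies by device, OS version, and tool version.

```python
# Illustrative only -- a rough sketch of the cheat-sheet idea, not the
# real matrix. Each extraction type yields a superset of the one above it.
EXTRACTION_ARTIFACTS = {
    "logical": {"contacts", "call_log", "sms"},
    "advanced_logical": {"contacts", "call_log", "sms", "app_backups"},
    "full_file_system": {"contacts", "call_log", "sms", "app_backups",
                         "location_data", "deleted_records", "keychain"},
}

def extractions_with(artifact):
    """Return which extraction types would be expected to yield an artifact."""
    return sorted(name for name, arts in EXTRACTION_ARTIFACTS.items()
                  if artifact in arts)
```

Something like `extractions_with("location_data")` makes the court-explanation point concrete: if only a logical extraction was possible, certain artifacts simply won't be there.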

Speaker 1:

Oh, that's so important. And I had, you know, some stakeholders in the past telling me, well, where's this thing? And I'm like, well, it's not here. Well, the other examiner from somewhere else did have it. And I'm like, well, I'm pretty sure that person had it because they had a full file system, but the one that we have right here is BFU, so it's way more limited than a full file system. What's a BFU, right? So this is a good resource to make sure that you can communicate with people clearly. Like you're saying, Cellebrite should have come up with this before we needed it. So they need to read our minds.

Speaker 2:

Right, right. So maybe a little bit sooner next time?

Speaker 1:

I know Paul's in the chat. Why aren't you reading my mind, man? Come on, get on it. No, it's super useful.

Speaker 2:

There are certain agencies that only have access to regular UFED, right, so not UFED Premium. They may only be able to get an advanced logical, and this sheet would definitely come in handy for them to be able to explain why they don't have access to certain artifacts.

Speaker 1:

Exactly. And, like Troy is saying, you know, Heather and myself, we're on the prosecution side of the business, but for folks who work the defense cases, right, when I provide some discovery now, my counterpart on the other side of the aisle can look at this and I can say, look, this is what we got, this is what this means. So it's a good tool overall for everybody to have these kinds of cheat sheets going around.

Speaker 2:

There's also an episode of I Beg to DFIR that goes along with this cheat sheet. Throw it up here. It's episode 23, and it talks a little bit about this sheet and goes into more detail as well: Data Extractions Explained. If anybody wants to catch that webinar, this link will be up on our site when we're done with the podcast.

Speaker 1:

And be aware of something, right. This is obviously a great cheat sheet from Cellebrite, but other vendors might have additional extraction types that are not in this list, for obvious reasons, because they're a different vendor, right, and vendors might have the same type of things and change the names a little bit. So there's a good, you know, friendly competition between Cellebrite and the Magnet Forensics folks. Magnet, after they merged with Grayshift, right, now it's all one company, and now they have their own capability of doing extractions, right.

Speaker 1:

There is one that's called Logical Plus, and I think we're both really familiar with it. That's something that you also need to be aware of, right? It's like their version of advanced logical, I guess that's the best way I can describe it, and you should make those comparisons. You can take your test phone and make different extractions at different levels. Understand that the cheat sheet is a good thing, of course, to make some explanations, but the real knowledge will come from you making those comparisons. And Heather's been kind enough to give me some test data that she makes with her test devices, and I did a presentation for a group from my agency where I compared what's the difference between this extraction and this extraction and this extraction, right, and that's super important. So if you have the Cellebrite tooling and you have a Logical Plus, look into that as well and see where those differences are. Maybe in the future we can talk a little bit about some of the key differences there. I don't know. Yeah, brainstorming.

Speaker 2:

Yeah, definitely. And it says here Ian and Josh did a follow-up, episode 25, with the decoding, so there's another episode there for everybody to check out that talks about decoding.

Speaker 1:

Yeah, there's a question Is the cheat sheet openly available for download? And the answer?

Speaker 2:

is yes. I have it here, and that link will also be up. It's also on Heather Mahalik's LinkedIn page, if you're connected with Heather Mahalik on LinkedIn.

Speaker 1:

Yeah, so for the folks that are listening, we're gonna put, obviously, as always, all those links in the show notes, so you can just copy-paste them and get whatever you need. So, pretty good stuff. We need more of these. Oh, and always remember, if you're a Cellebrite product user, they have a whole bunch more cheat sheets in the portal. There's one.

Speaker 1:

One of my favorites is the one about locations, and they actually go and analyze: okay, there's this location data source, and what's the level of accuracy. And that's something that we need to be aware of as examiners. Datasets will give you some data and that's fine, but that data has degrees of certainty, and we have to be careful with that. For example, one of your location artifacts might record your position second by second. Another one might not. Another one might aggregate your location from multiple points, and there's a level of not how accurate, but how precise those readings are with respect to where they happened.

Speaker 1:

So they have a good cheat sheet that explains that there are some location artifacts that just suck. You don't rely on them, period. You cannot rely on them. You can use them as a, okay, it's recording these things, good to know, to kind of orient your analysis. But you cannot use them as an evidence point, because they're not accurate or not trustworthy in regards to when they were recorded or whatever it is. So the cheat sheet has some of that, and maybe we should do another segment on that in the future.

Speaker 2:

Yeah, well, just to add to that too, if you're gonna go get that location cheat sheet, follow it up with a visit to Ian Whiffen's blog, DoubleBlak, because he has all of the test data that goes along with that cheat sheet, and how he came to those conclusions, right on his blog, and it's excellent test data.

Speaker 1:

Absolutely, absolutely, and I appreciate Cellebrite's efforts in providing these resources, because it also tells us a little bit about what the tool does. So it provides a level of transparency. It says, look, we find locations, or we're doing these extractions; this is what this means, this is what the tool's doing behind the scenes. And I have a lot to say about that, because I appreciate that and, again, that's a good thing that Cellebrite does and other tools do. But stepping back and looking at the whole field of tool makers, which obviously we depend on, transparency is a thing that I think we should ask for more, right?

Speaker 1:

And lately Heather and myself have had this discussion about doing some tool testing, verification of the tooling, and it's interesting because we are so used to just blindly taking whatever the tool says and pushing it forward, right. And sometimes that's not the case, and we say it, and people hear it, and they don't really, oh, yeah, sure, whatever, I need to verify that. Sure, sure, sure. And we don't. And that's why there's the big sign here next to me that says verify. And I wanna make a quick point on why that is important. Usually, right, if something happens to the tool that breaks the tool, we're gonna hear about it, right? If something might involve a vulnerability that might change or affect how accurate the tool is, we'll hear about it.

Speaker 1:

But little things that might not affect that, we usually don't hear about. For example, let's say you have a report, and your report is providing data, and the data is not inaccurate, it's accurate. But you were expecting, let's say, 20 items of that data and you got five. And you're like, why am I seeing five when I selected the 20, right? Again, this is a made-up example, right? Or there's this process, and the process is correctly shown in your tooling, but when you go to output that to provide it in a report, then the report is missing something, or has duplicates of the thing, and I'm like, I don't see any duplicates here. Why do I have duplicates over here?

Speaker 1:

And again, it's not affecting the accuracy. Like, the data's not wrong, it's true, but the tool is not behaving appropriately. There might be a bug in it. So the question is: is this something that the tool makers should be making us aware of? Are we in favor of transparency at that level? Why yes, or why no? So, Heather, what do you think? Is it something that vendors should be letting us know when stuff like this happens, or what are your thoughts on transparency?
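The kind of check being described here, twenty parsed records showing up as five in the export, or duplicates appearing only in the report, can be automated once you have record identifiers from both sides. A minimal sketch, assuming you can pull comparable IDs from the tool's display and from its exported report:

```python
from collections import Counter

def cross_check(parsed_ids, report_ids):
    """Compare record IDs the tool parsed against IDs in its exported report.

    Returns (missing_from_report, duplicated_in_report), so an examiner can
    spot both the 'expected 20, got 5' problem and duplicate report rows.
    """
    missing = sorted(set(parsed_ids) - set(report_ids))
    dupes = sorted(i for i, n in Counter(report_ids).items() if n > 1)
    return missing, dupes
```

For example, `cross_check([1, 2, 3, 4], [1, 2, 2, 4])` flags record 3 as missing from the report and record 2 as duplicated, exactly the two failure modes Alexis describes.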

Speaker 2:

So I want to know when there's a major issue in the software, whether it's a parsing issue or a reporting issue or something else altogether, especially if the fix isn't coming immediately. I might choose another tool to use if I know there's an issue. But honestly, you should be using more than one tool anyway. You should never be using one tool for analysis. But if I know already that there's a known issue in a tool, I may choose a different route for analysis completely. So when I'm deciding which tools I'm going to use for analysis, if there are known bugs and I know something is gonna take me extra time because there are already issues, I might choose a different route. It's the time it takes, the time it's going to set me back, right.

Speaker 2:

So if I'm working on a case and I find a bug and I'm trying to troubleshoot it, it takes me that time to figure it out. I need to figure out, is it me? And of course I always think it's me first, is it user error? And then I need to figure out what did I screw up. And then I figure out it isn't me, I didn't screw anything up. And then it takes extra time to go back and forth with support to figure out that it's a known bug. And now I have to wait. So, all of that extra time that it has taken me to go through all of these steps: if it was a known bug, I just wish I could have gone somewhere to find out this was a known bug, and that I should not have used the tool for what I was planning on using it for, I guess, if that makes sense.

Speaker 1:

No, it does. This reminds me of the Taylor Swift song, right? It's me, hi, I'm the problem, it's me.

Speaker 2:

Right, right. So initially I think, what did I do wrong? Every time, right. But if there's somewhere to go and just see, okay, there are these known issues with this tool right now, I'm not going to use it for that, right? I can use a different tool for that function.

Speaker 1:

Well, yeah, and you're saying, right, first you figure out, is it me, right? And if I don't think it's me, how do I know? Is it the dataset? Is it the actual tooling? I mean, that is usually a time-consuming process. So you're in favor of transparency. You have a list of reasons for that, right? Yes. Like, why would anybody say no to that?

Speaker 1:

Well, I'm gonna put my devil's advocate hat on in a second, but all right, so there's time involved in that, right. The verification that it is a bug, because even if you think it is, the vendor is the only one that can confirm that for you. And there you go. Oh, another point, actually, let me make two points here based on what you're saying. The first one, I wanna read from the chat. Abraham says report issues early because they have impact on your case and your reputation as an expert witness. And that's true, right.

Speaker 1:

But you don't wanna figure out that there's something wrong with that output when you're sitting on the stand, or whatever it is. And sometimes the output is the output. Like I said, you run the tooling and, as part of the discovery, you have to provide that output. You wanna make sure that your stakeholder knows: hey, this output looks the way it looks because there's a known issue, but that's fine, right, we can work around it. So Abraham makes a good point there. Now, transparency, is it good, is it bad? You have any other points in favor of transparency before I put on my devil's advocate hat?

Speaker 2:

So the saying always is trust the tool, but validate, right? I don't know, I guess, why would you start off trusting? I'll take the quote from Alex from CCL Solutions. He recently said this to me because he has a poster in his office that reads: the default position is to distrust the tool. If you're doing that, it gives you, as the examiner, the power to validate and prove that the tool is working. So I wanna start with distrusting, and then I want to personally prove that the tool is working. I love that quote from him. So I guess Adam's comment right there, validate, validate, validate, is the answer. But when it comes to transparency, I just think life would be easier if I went in knowing what the known bugs were.

Speaker 1:

And I'll be straight with you, buddy, I always feel myself being more in the camp of transparency than not. But I do like to also try to see the other side, right. And I say, well, if I'm a vendor, right, what would transparency at that level mean? I'm saying, look, I spent, as a vendor, again, with my vendor hat on, I spent $20 million a year on promotions, right, and advertisement of how awesome my tool is, and then I'm gonna undo it with a list of known issues that I haven't had time to fix, or whatever it is, right? Do I wanna do that? I guess the answer is no. And it's a tough proposition because, let's be real here.

Speaker 1:

And this is me in my devil's advocate hat here, trying to understand: companies are here to make money. That's what they do, it's a business, right. And they have to not only provide us a service, they have to compare themselves with other vendor providers in the field, because they're competing against each other, they're competing for users. That's just what it is. That's the reality of it. So let's say, if I'm a vendor, and let's say, well, there's a requirement across the field, everybody needs to disclose known issues the moment we get them. Then that's a fair field. Everybody has to disclose them. They're mandated to do so. But if they're not mandated, they won't.

Speaker 1:

Why would they do that, right? It might give the appearance of them being at a disadvantage with a competitor, I guess. And again, do I agree with that logic? No, but I want you to tell me. So a possible solution as well: what we're gonna do is we're gonna put it in the release notes, right? And they put it in their release notes, and the knowledge is there, but nobody will read it. Do you read release notes, Heather? Do you read them?

Speaker 2:

Once in a while, not very often honestly.

Speaker 1:

I appreciate your honesty Not very often.

Speaker 2:

Well, if I put in a bunch of support tickets and I'm looking to see if they've been rectified, I'll read the release notes, I guess.

Speaker 1:

Did you fix this issue that I have?

Speaker 2:

If not, I don't care, right yeah.

Speaker 1:

I mean, most people don't read the release notes, and this is the thing with release notes. Right now, I'm getting my devil's advocate hat off and putting my advocate hat on. Release notes, other than people not reading them, sometimes they're really opaque, right. There's an issue, and they only let you know that they solved it. They don't let you know what the actual issue was, at any level of detail, and that's important. It's important for you to read your release notes. One of the things that I think folks should do in their labs, even if you have a lab with multiple examiners, is have a person assigned to read the release notes and then make everybody else aware of what they should be aware of, if there's something important there, because most people don't read them. So have a person that does that. I mean, some of ISO 17025, right, for testing and calibration, requires performance verification, so you need to be aware of that. That's one way of making sure that you're compliant with your accreditation. So read those release notes.

Speaker 1:

But again, I have a little bit of an issue with release notes not being as transparent as I would like them to be either. Let me put my devil's advocate hat back on. So they're competing, right? And another issue is, let's say, for example, you have a public running list of issues that the tool has. Could one of those issues be misused by a bad actor to harm the tooling, or to introduce, you know, really big problems through the tooling? I mean, I understand that, right. Let's say the tooling is vulnerable to a particular exploit in a library that they use. Well, you might not want to make that public until it's fixed, or at least tell people, hey, look, make sure you mitigate the issue this way, without spelling it all out. So it's a big debate. Some folks do believe it should be disclosed immediately, and there's a debate on that part. So Kate Kane, a great examiner that I had the pleasure of knowing pretty well, she has a quote that I love: if you do it first, you do it worst, and history will thank you for it.

Speaker 1:

Vendors and consumers need to realize that a bug is not a weakness, and that's such a great pro-transparency point that she's making. And yeah, no, it's absolutely true. It's a beginning, an introduction. It's saying, hey, I recognize this artifact and I need to parse it, and we're working on it, we're only this far. And that's so true. And again, it's kind of tough for me, for example, from my perspective, because I'm not a vendor, right? Maybe there's a whole bunch of other issues that I don't know about in regards to the legal side, in how you disclose things and at what level. I understand, and now I'm going to get on my little soapbox: I understand that certain things you might not want to disclose, but, from my perspective, only if you're going to fix them really quick.

Speaker 1:

Okay, if you have a bug and you haven't disclosed it yet because you're about to get a fix out, then that's fine. Disclose it to me and give me the fix all at once, right? But if you're going to have a bug in your tooling and leave it sitting there for six months to a year, and then when I tell you about it you tell me, oh yeah, it's known, and I'm like, okay, are you doing anything about it? If you can't fix it yet, because it's not critical and you have things in your queue that are more important, I believe it should still be disclosed, so people can either mitigate that bug or, like Heather was saying, find out the recommended ways to get the output in the way they require it.

Speaker 2:

If they were fixed a little more quickly. I can agree with that. Sometimes it takes a really long time on some of the issues and I think that's part of my thoughts on why I want to know about them is sometimes it's a really long time.

Speaker 1:

Yeah, I mean, we all understand they're stretched thin, and development cycles, they have a process, right? It's not simple, like, I'll just fix it and it's done tomorrow. I understand that's not the case. I do open source tooling, and I wish I had 10 more people like Johann, which we'll talk about in a second, and like James and like Kevin Pagano and some other folks, but we don't. And so I try to empathize with where the vendors are coming from. But I think they have to, like Kate's saying, be upfront about it. It's not a weakness. Empower people by letting them know what's going on, or just make sure you have the cycles to be able to fix those things quickly and then let us know as they're resolved. So I guess my devil's advocate hat was not really well fitted to my head.

Speaker 2:

I mean, ultimately, though, at the end of the day, it's on the examiner, right? So if there's an issue in the tooling and you find it, and you figure out it's not user error, which is what I usually suspect first, you have to figure out what to do about that in your own case, figure a way around it, and figure out how to report the data in an accurate way for your case.

Speaker 1:

Oh, my goodness, you hit the nail on the head. I'm going to just echo what you said. At the end of the day, it is about you as an examiner, it's not the tooling. Josh, a good friend of mine from way back, we were talking about this some time ago, and he told me that how you do your exams is just as important as the artifacts, the tools you use and all the technical stuff. Right? The process, what you do, is as important as all those other things, and that's absolutely true. You need to make sure that the exam is done by you, not by the tool by itself. By you. And I hate it when people tell me, and I know you're with me, Heather, well, I don't have time for that. Who's going to sit and look at all those? Well, you. You are the one who should. Yes, yes.

Speaker 2:

It's like in your new training class when somebody says, when am I ever going to use this? You're going to use it.

Speaker 1:

Yeah, no, absolutely. Let me put up some of the comments that came in. Troy is saying an example is Microsoft, right? They sit on issues. And again, Troy is making a great point, it's not only the digital forensics software field, right? Vendors do this across the whole software industry. Software as a product, that happens all the time, and software is becoming more mission-critical every day. So, absolutely. And Brett, again, I'm so happy to have Brett around: known and expected behavior of a tool is much better and easier to explain than a non-disclosed bug. Absolutely. Because it depends on you, right? You as the examiner need to say, look, we ran some tooling and this is what it says. But is that what the data actually is? The tool doesn't validate me. I validate the tool. So you've got to make sure of that.

Speaker 1:

And again, it's a challenge. It requires you as an examiner to up your level. And look, I understand that you can't go and do this type of verification of the data on every single thing in your case. I understand that. If you did, you'd be working one case for the rest of your career. But if you have one, two or three main points, do that, and for the rest, do a sanity check, okay? And if you find a bug in the tool and that item is not mission critical, not critical to your report, there's no point in having that data set there. Take it out and do a fully streamlined report that addresses the questions you need to answer, whose key items are verified as correct. What do you think, Heather? Am I on the right track there?

Speaker 2:

Yeah, definitely. And one other comment too: report the bugs when you find them. I don't know how many times I've reported a bug and a company will say to me, nobody else is reporting this. And I'll be like, how can nobody else be reporting this? So if you find them, report them. Don't just move on. If we don't report them, nothing will get fixed.

Speaker 1:

And I think the issue is that it's not that people are not finding them. They find them, but sometimes the reporting process is a little bit convoluted. You have to go to this website, and then you have to log in, and then you have to click on, I wanna do a report, and then say what severity level it is. I don't know, it's just a bug, right? And then you have to type the thing up, you have screenshots, and then there's this whole process, and people give up. So maybe it's something for the software developers to think about: how do we capture that data in a way that makes it easier for the users, right? And in this field it's hard, right? You cannot really automate any of that, because our tooling runs on standalone networks that are not connected to the internet. That's standard procedure, right? So it's not like the vendors can simply do, what's that called, metrics? Like remote metric collection.

Speaker 1:

Telemetry, that's the word I'm looking for. It's not like they can use telemetry to figure out what's happening. So it's on you, the examiner, to make sure the vendors know, so they can solve those issues. And since you brought up the topic, I'm gonna say something else. It's also on you, examiner, if you find that a tool is not doing something you want it to do, to tell the vendors that too. Because again, I talk to some vendors and say, look, this is a thing that people need, you should support it, put it ahead in the queue. And they're like, nobody's asking for it. Like, okay, well, but you still need to do it, right? Maybe they don't know they need it.

Speaker 2:

Yes, if you want something, definitely report that you want it. Yep, absolutely. So, there's a question: Is it common in this field to use one tool to test and another tool to confirm the results, or is it more that you use one tool and get a result, then validate those results through another process?

Speaker 1:

Oh, what a great question here. Go ahead.

Speaker 2:

I mean, I use at least two tools on every single case, and I'm using them for both. I would say so. I'm using them to confirm results against each other, but I'm also using them because each tool has different capabilities, right? They have different capabilities to parse different artifacts. So I would say both. How about you?

Speaker 1:

Well, I think I agree with you: both. But the reason I would say both is because when the question says, validate those results through another process, well, guess what? The other tool is another process, right? Because the other tool is made by different developers using a different methodology or way of doing things. So you are validating with two different processes, aka the two different tools.

Speaker 1:

Now, there's a lot to unpack there, right? You still need to, well, it depends. If it's your key evidence, you have to put eyes on it. And when I say eyes, I mean the manual work, right? If these are the smoking-gun bytes, I want to see those bytes with my hex editor or a hex viewer. Does that make sense? I want to make sure this is right. Let's say, well, this SEGB file has this particular entry for a conversation that was deleted, right? That's in iOS. I will get that SEGB file out for that particular chat, look at it myself, and trace back how that came in, to make sure it's being accurately portrayed. That's why, in tooling, and we discussed this before, it's important for tool makers to give me traceability.

Speaker 1:

I want to make sure that whatever you're showing me in your tooling or your report, I can find where it is, right? Well, it's on this path, in this location. If it's a SQLite database, it's in this table, in these columns, right? To make my job easier, to be able to backtrack it. And tools, for the most part, are somewhat good at it. And I say somewhat good because they may report, you know, the records and fields for a database, but they don't give you the query, right? They don't tell you how they actually made that query on the database. So they give you the tables and the fields and the records, but how was that result built? So again, there's gonna be some elbow grease that you have to apply to verify that data, and that's your process: do it yourself, and then run the other tools. That's, I think, the easiest way, or the way most people will do it. That's a great question.
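As a rough sketch of that manual backtracking step, here is what re-checking a tool-reported record against its source database can look like. Everything here is hypothetical: the schema, paths, and values are invented for the demo and are not a real iOS schema or any vendor's actual method.

```python
import sqlite3

def build_demo_db(path):
    """Stand-in for an extracted evidence database, so the example is self-contained."""
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE IF NOT EXISTS message (id INTEGER PRIMARY KEY, text TEXT, date INTEGER)")
    conn.execute("INSERT OR REPLACE INTO message VALUES (1, 'meet at 9', 1700000000)")
    conn.commit()
    conn.close()

def verify_record(path, record_id):
    # mode=ro opens the database read-only, so the evidence copy is never written to
    conn = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
    conn.row_factory = sqlite3.Row
    row = conn.execute(
        "SELECT text, date FROM message WHERE id = ?", (record_id,)
    ).fetchone()
    conn.close()
    return dict(row) if row else None

build_demo_db("demo_evidence.db")
# Re-query the table and columns the tool's report cited, and compare
print(verify_record("demo_evidence.db", 1))  # -> {'text': 'meet at 9', 'date': 1700000000}
```

The point of the sketch is the read-only re-query: if the tool's report cites a path, table, and columns, you can reproduce the record yourself instead of trusting the rendered output.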

Speaker 2:

Yeah, definitely.

Speaker 1:

But yeah, at the end of the day, look, you could do a full exam by hand. It'll take you a long time, but you could do it all by hand, just by looking at the files and doing the analysis manually. I don't recommend it, but in theory you could. So do that for the key items, and definitely use the tools for everything else. Which, talking about tools, as always we've got to discuss some tooling that came up recently.

Speaker 2:

Ah, yes. So we recently had the chance to try out Paraben and Paraben's capability to run ALEAPP and iLEAPP, so I'm gonna just share a few slides. I'm not gonna run the tool live, because it takes a little bit of time and we just don't have the time to run it live. So let me just show you a few slides of the process to bring a full file system in and process it with ALEAPP and iLEAPP through Paraben. So I'm gonna full-screen this here.

Speaker 2:

So this is what the interface looks like in Paraben 3.7, which is a new release. And to bring the extraction in for the ALEAPP and iLEAPP functionality, you choose the import data option and name your case, I just named it test, and you have the options here to choose what you wanna do with the extracted data that you're gonna bring in. I chose the iOS analyzer for iLEAPP data, but there are other options. You can choose the ADB backup, the Android analyzer for ALEAPP data, Cellebrite UFED data, GrayKey data and others. And then you have the option to bring in your archive file. So I brought in, of course, Josh Hickman's iOS extraction, his iOS 15 extraction.

Speaker 1:

Which is publicly available, and we'll put it in the notes for everybody. If you don't have it yet, you should.

Speaker 2:

Yes, his test data is publicly available, anybody can download it. And then you set the tool off and process the extraction with the iLEAPP option. So it processes and then finishes, and I will actually share here. Let me remove this; I will share the interface.

Speaker 1:

Yeah, and for the folks that are listening, it's a pretty straightforward process. You make those selections, choose the type of reporting that you would like and how you would like it to look, and then you can pull that in, which is what Heather is now bringing up on screen.

Speaker 2:

So this is the actual tool with the extraction all processed, and let me just zoom in here. You can see that it parsed the iOS extraction for the iLEAPP data. So this is the data you would see if you parsed this extraction in iLEAPP, but it's in the Paraben tool. And the cool thing about running it in Paraben is you can actually go to the different artifacts and choose what you want to put in your report. So I can come over here, and I don't have to choose everything. With iLEAPP by itself, it runs the report and you have all of the artifacts in your report automatically. With Paraben, you can come over and choose just a couple if you want to.

Speaker 1:

And that's so useful. Again, talking about tool makers being transparent, let me be transparent about my own: in the Leaps by themselves, you don't have the ability to select individual items, to make a report of only selected items. That capability does not exist in my tooling yet, but Paraben provides you a way of doing that, of saying, okay, I'm gonna parse it with iLEAPP through Paraben, and then from that iLEAPP-generated data I wanna pick, like Heather's doing, this item, this item, this item, and then report on that selection. Correct?

Speaker 2:

Correct. So I'm going to just back out of here, and then you have the function to create a report. Let me just stop sharing here; I have a report I will share.

Speaker 1:

And it has multiple report types, HTML, and I think PDF as well, right? And some others.

Speaker 2:

It does. I have a little slide of that, so here's the different options for the reports. Can I actually-?

Speaker 1:

Yeah, let's read a few here for the folks that are listening. It has a mobile evidence timeline report, a PDF report, Excel, HTML, simple text, RTF, CSV. Of course, you have to try them to figure out which one might be best suited for your use case.

Speaker 2:

So I ran an HTML report, and I'm just going to share that real quick.

Speaker 1:

Yeah, reporting is so important. We've been harping on it for a couple of weeks.

Speaker 2:

I'm going to try and share it here.

Speaker 1:

Yeah, we were talking a few weeks ago about how vendors have really dropped the ball on reporting lately. There we go.

Speaker 2:

And this is what the Paraben HTML report looks like.

Speaker 1:

Okay, so those are all iLEAPP artifacts, correct?

Speaker 2:

These are all iLEAPP artifacts, yeah. I just went through and chose a few from a few different categories: the address book, there's some contacts in here, and some Bluetooth devices.

Speaker 1:

Look, I'm really happy that Paraben has been kind enough to add that community tooling to it. So that's a pretty good thing, I will say. And then this is a critique that I make of every vendor, every report: there's so much white space on the screen. Maybe we could consolidate those records and those fields a little bit and make better use of the space. That's a big pet peeve of mine with reporting. Don't give me, let's say, 15 items in two pages when you can give me those 15 items in a quarter of a single page. So better use of that white space would be nice. Again, constructive criticism there, but it applies across the board to all vendor reports.

Speaker 2:

I have to agree on the reporting. I am a big fan of being able to choose the different iLEAPP and ALEAPP artifacts, though. I really like that feature.

Speaker 1:

No, and maybe it's something that, moving forward, I need to try to integrate into the tooling itself. But I do appreciate Paraben and some other vendors embracing open source community tool work.

Speaker 2:

Yeah, it's really cool that they added it in.

Speaker 1:

Yeah, it gives another way of easily getting the tooling to run on a particular system. So a big issue, I have to say big, but an issue that my tooling has is that it doesn't ingest physical extractions, the .bin files. And for those that don't know, with a physical dump you have pretty much everything from that phone. Okay, and that's not common anymore. On modern iOS and Android devices you get a full file system, because, and we discussed this already in the past, right, because of file-based encryption, that's all you can get. You don't get a physical dump. But if you do get one, my tooling will not ingest it.

Speaker 1:

Now, Paraben does support those types of binary dumps. It can ingest one and then make it understandable to the Leap that you need, and do that for you. So that's a good thing that it does. Autopsy does the same thing, leveraging the fact that they know how to read .bin files, straight-up full physical dumps, and then pass them over to the tooling for analysis. So that's a good benefit there, if you have an Android physical. I keep saying full file system; full file system is so common that my mouth just says full file system. Troy is asking, let me show you here: does it provide searchability of all artifacts? Have we tried that?

Speaker 2:

You know, I am not sure if it has search in the ALEAPP and iLEAPP data. There's an advanced search. I'm going to try it out for you right now. I'm going to guess that this is a yes, right here. Let me try it.

Speaker 1:

Well, while you're trying it, let me tell the folks what happens behind the scenes. What my tooling does is it also creates tab-separated value reports. And what the third-party tooling does is it runs the tool, takes those TSVs, and then either puts them in its interface or does some stuff with them. So it's pretty much a bunch of text files, generated by the Leaps, that are presented through the interface. On top of that, the Leaps also provide the tooling, let's say Paraben, Autopsy, whoever, a list of the locations of the artifacts that should be made available to the scripts for processing. So that's how it works behind the scenes.

Speaker 1:

The Leaps tell the tooling what files they want; the tooling hands them over; the processing is run by the Leaps; and then these TSV files, these tab-separated value text files, are incorporated into the mother tooling, if I can use that term to describe the third-party tool, for presentation to the users. So that's pretty much how it works behind the scenes. Hopefully that makes sense.
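To make that hand-off concrete, here is a minimal sketch of a host tool ingesting a Leap-style TSV report. The column names and values are invented for illustration; a real host tool would read actual report files from disk.

```python
import csv
import io

# A Leap-style report is just tab-separated text; any host tool can parse it.
# This sample content is made up for the demo.
tsv_report = (
    "Timestamp\tAccount\tMessage\n"
    "2023-11-16 10:00:00\talice\thello\n"
    "2023-11-16 10:01:30\tbob\thi there\n"
)

# StringIO stands in for an open report file, keeping the example self-contained
rows = list(csv.DictReader(io.StringIO(tsv_report), delimiter="\t"))
for row in rows:
    # A host tool would render these rows in its own interface
    print(row["Timestamp"], row["Account"], row["Message"])
```

Because the interchange format is plain text with a header row, the third-party tool needs no knowledge of how the Leap produced the data, only where the TSV lives and how to split it on tabs.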

Speaker 2:

So I'm getting zero hits for DFIR, and I know that's in the Hickman image. However, I've also talked a lot about user error tonight, so I'm wondering if I'm just not choosing the correct options to do the search, because there is a search option here. I'm assuming I'm just not choosing the right dropdowns. I don't have a lot of experience with Paraben.

Speaker 1:

And even if you did it right, you're doing it as a demo, and the demo gods will always get mad at you for trying. Absolutely. It will screw you up even if you're doing it right, so don't worry about it.

Speaker 2:

I'm going to assume that there is a search, because it says advanced search here, but I didn't get any hits on that.

Speaker 1:

Well, I will say that some good folks like Johann and James are actually envisioning a future for the Leaps where some of that capability will be included. It requires me to upgrade my knowledge in regards to programming languages and some other things, and hopefully next year I'll be able to jump in and add those capabilities. Which, talking about capabilities, before I jump in, are we done with the Paraben stuff?

Speaker 2:

Yes.

Speaker 1:

All right, good. So let me show you some of the stuff that's lately been added to the Leaps, before we run out of time. So Johann, again, I think he's in the chat today, he wrote some of the scripts that we have in the tooling. One of them is the calendar function, and he made it so, so pretty. You actually have some little colors there, and it tells you the calendar names, information about those calendars, what account they come from. A lot of detail that didn't look as nice beforehand. He also added a little quick feature that I like to the reporting, specifically the logging files from the Leap tooling. So let me show you that. Originally, when the tooling runs, the artifact log looks like this. What you see there is:

Speaker 1:

It says, you know, the files that were found and for what parser. It didn't even have the name of the artifact they came from. Now it does. And he added additional things. Let me show you the additional things he added. That's the old version; the new stuff, I like it a lot. Let me see which one it is. I think I did not put the picture in. I'll explain it verbally, because we're running out of time.

Speaker 1:

So what he added now is, when you look at the list of files that were processed by the tooling, it tells you what artifact initiated that search for files, what's the regular expression that found those files, and it tells you the files and how many of them. Well, for the Twitter artifact, it was looking for this regular expression and it found 20 files, and here's the files. And you might say, who cares, right? Well, I'll tell you: developers, and myself, we do care, because we want to make sure we understand whether the tooling is actually hitting the files that we need.

Speaker 1:

If it's hitting too much, we need to tighten our search. If it's hitting too few, we need to open it up. And if you're looking to backtrack where data came from, that log will help you with that verification. So those little things that Johann is adding, they add a lot of value, especially when you need to do deep dives. When you're doing deep dives, you want to have as much information about that artifact as you can, to be able to do your verification, do it faster, do it smoother. So I appreciate Johann's work on this. Also, James is doing a lot of work behind the scenes that I hope to release at some point in the future, that will benefit everybody that's using the tool. So good stuff.
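The pattern-matching-plus-logging idea described above can be sketched like this. This is an illustration in the spirit of that logging, not the actual Leap code; the artifact names, regexes, and file paths are all invented for the demo.

```python
import re

# Map each artifact to the regular expression used to find its files.
# These patterns and paths are hypothetical examples.
artifact_patterns = {
    "Twitter": re.compile(r".*/com\.twitter\.[^/]+/.*\.db$"),
    "SMS": re.compile(r".*/SMS/sms\.db$"),
}

file_listing = [
    "/private/var/mobile/Containers/Data/Application/com.twitter.ios/twitter.db",
    "/private/var/mobile/Library/SMS/sms.db",
    "/private/var/mobile/Library/Preferences/com.apple.Preferences.plist",
]

match_log = {}
for artifact, pattern in artifact_patterns.items():
    hits = [path for path in file_listing if pattern.match(path)]
    match_log[artifact] = hits
    # Logging the pattern and hit count is what lets you decide whether to
    # tighten the regex (too many hits) or widen it (too few)
    print(f"[{artifact}] regex {pattern.pattern!r} matched {len(hits)} file(s)")
    for path in hits:
        print("   ", path)
```

A log like this also supports the backtracking use case: given a record in a report, you can see exactly which pattern pulled in the file it came from.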

Speaker 2:

Yeah, he does awesome work, All right.

Speaker 1:

So you have a question.

Speaker 2:

I'm going to put it up there. Got a full file system on iOS, 30-plus days after the images were deleted. Any hope of the deleted images being in Photos.sqlite?

Speaker 1:

Well, I mean, the photos themselves are not in the SQLite database, right?

Speaker 2:

Yeah, they're not going to be in the database.

Speaker 1:

Yeah. So I guess the question is, is the information about those pictures there? Because, for those who are not familiar, Photos.sqlite is a database, and there's more than one, by the way, on iOS devices, that keeps a lot of metadata about images on the device. There's one for the pictures that are on the device, there's one, well, we're running out of time, there's a few of them, right? It's a tough one, because my understanding, and Heather, correct me if I'm wrong, is that the picture data is gone. When the cleanup happens, it takes that data out. Maybe it lives in the WAL file if you're lucky, or maybe somewhere else, but that's a hard find.

Speaker 2:

If you're looking for the picture itself, I'm going to say probably not. Maybe go for the cloud data, but it would have to have been backed up to the cloud.

Speaker 1:

Yeah, possibly. There's a Photos.sqlite for different things. I think there's even one for the cloud. I need to look at my notes.

Speaker 2:

Yeah.

Speaker 1:

There's a few of them. But I guess the point is this, right? If you're dealing with a SQLite database, what's going to rule is how the database is set up, right? What are

Speaker 1:

the PRAGMAs for it, right, and when it does the cleaning. And for a lot of these, that data is constantly being updated. So it's tough, especially after 30 days, and we're making some informed assumptions here, but after 30 days, that's pretty tough. Actually, 30 days is pretty much the limit for anything. Usually on iOS devices you can say, well, chats, for example, maybe the chat got deleted, but maybe the chats are in the SEGB files, or in the knowledgeC database if you have an older iOS device. But after 30 days, there's no guarantee it's going to be there. So that's a tough one. Anything past 30 days on iOS devices, as a rule of thumb, it's not going to be there. Do you agree with that, Heather?
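Since the answer hinges on how the database is set up, here is a small sketch of inspecting those settings. It runs against a throwaway database to stay self-contained; on a real exam you would open the evidence copy read-only, and none of this is specific to Photos.sqlite.

```python
import sqlite3

# Create a throwaway database just for the demo
conn = sqlite3.connect("demo_pragmas.db")
conn.execute("CREATE TABLE IF NOT EXISTS t (x TEXT)")
conn.commit()

# The PRAGMAs hint at whether deleted rows might linger in the file or its WAL:
# - journal_mode 'wal' means recent changes may still sit in a -wal sidecar file
# - auto_vacuum 0 means freed pages are not truncated away from the main file
# - freelist_count is the number of unused pages that could hold stale data
journal_mode = conn.execute("PRAGMA journal_mode").fetchone()[0]
auto_vacuum = conn.execute("PRAGMA auto_vacuum").fetchone()[0]
freelist_count = conn.execute("PRAGMA freelist_count").fetchone()[0]

print(journal_mode, auto_vacuum, freelist_count)
conn.close()
```

None of these settings guarantee recovery, of course; they just tell you where stale data could plausibly survive before the database reuses or vacuums those pages.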

Speaker 2:

Yeah, yeah. You're probably not going to get the actual images.

Speaker 1:

Yeah, yeah, that's a tough one, yeah. So, one thing I wanted to do before getting into the meme of the week to close the show: I want to talk about November. I know it's the month of November, but I'm trying to make it the month of NO, in caps: NO-vember. And you're like, why are you so negative, with the whole no thing? So I did a post today about the whole NO-vember thing, and I'm going to show folks here that haven't seen it how it looks. It's an article that I read.

Speaker 1:

I read about how saying no is important, and lately I've been feeling a little bit of burnout. I'm not burned out, but kind of getting there, I guess, because I want to help out, I want to do things.

Speaker 1:

I like to be busy and do things that are productive, but the fact of the matter is that we can't say yes to everything, and we need to take some time to start telling folks no. And not saying no just for the sake of saying no.

Speaker 1:

I mean, my opinion on that is you've got to prioritize. You can't say yes to everything, because not everything is equally important, and I think saying no to some things really honors the things that you actually say yes to. Of course, if you have plenty of time to take on multiple responsibilities, then sure, you want to do it, sure. But if time is short, which it usually is, you have to take some time to say no, and that will also open up some time for you to take care of yourself, right, and take care of your family, and then be effective at the things you actually say yes to. Because there's nothing worse than saying yes to everything and then doing a half-baked job, am I right? I'd rather not do something than do it halfway. So, NO-vember. I'm trying to start saying no, and Heather, you know, now she asks me all sorts of questions just to see if I'm going to say yes or no.

Speaker 2:

He's saying no to me, though he's supposed to say no to everybody but me. I don't know how this happened.

Speaker 1:

Look, I'm going to take that as progress. It's progress, believe me, it's progress. So I want to leave everybody with that thought: there's nothing wrong with saying no. Just be aware of what you're doing, be judicious, be present. Some things are better left for later, or for another person to do, so they have an opportunity to shine and do things. So, good stuff. All right, now the meme of the week to close it out, and I can share my screen.

Speaker 1:

Yeah, yeah, this is always my favorite part of the show. So the meme of the week shows a person walking who steps on a rake, and the handle hits him in the face, and it says, newbie examiners when they don't validate. And then the next image shows the person doing a really complex trick on top of the same rake, and at the end they step on it and it hits them in the face, and it says, senior examiners when they don't validate. Right? Either way, whether you're new or old, if you don't take the time to do things right, you hit yourself in the face with that case. And it's kind of funny, because I made this meme, right, I put it out, and then I stepped on the rake and hit my own face. I say that because when I put it out, I was correctly and immediately educated. Let me see if I can find it. I don't have it here, but it was on LinkedIn.

Speaker 1:

I was quickly, and properly, told: hey, look, you validate processes, but you verify the data, and maybe what you actually meant, based on ISO standards and all that, is verification. And I'm like, you know what? You're absolutely correct. So I'm making a point of using verify properly, and validate properly, in the right context. For most people, validation versus verification seems like a distinction without a difference, but it is a real distinction. So yeah, you have to make sure you validate. I guess this show's been all about validation. So don't drop the ball, don't step on the rake.

Speaker 2:

Yeah, definitely. The theme of tonight: validate, validate, validate; verify, verify, verify.

Speaker 1:

Yep, all right, so let me stop the screen share. Kevin Pagano says, yep, kickflip face-plant, happens to the best of us. It sure does. Oh, look at that, there we go. All right. So, anything else for the good of the order, Heather, that we're missing, or I'm missing?

Speaker 2:

That's it. That's all we got.

Speaker 1:

Well, again, thank you everybody for being here with us. Some new faces showed up in the chat, also faces from real life that were kind enough to show up to the show. I appreciate your comments, we appreciate your questions. Yeah, absolutely. You can find us on LinkedIn, and obviously on YouTube; make sure to subscribe so you can get alerts when we go live. We go live every other week, so no show next Thursday. We also have to be social for Thanksgiving, right?

Speaker 2:

It is. Yeah, I'm not doing it on Thanksgiving. It's NO-vember.

Speaker 1:

Exactly, no show. It's yes to turkey. So thank you everybody, and we'll see you all in the next episode. Take care.

Speaker 2:

Thank you.

Speaker 1:

Bye, bye, bye.

Work Updates, Learning Python, and Scholarships
Tool Transparency and Data Extraction
Transparency and Issues in Tooling Analysis
Using Multiple Tools for Data Validation
Data Extraction and Reporting in Paraben
MEME of the Week