Apache NiFi GROUP 2 course recordings

Duration    Recorded
2:05:45     2024-05-20 09:46:48
0:02:09     2024-05-20 12:30:32
2:41:18     2024-05-20 12:33:23
1:36:58     2024-05-21 08:00:54
5:24:36     2024-05-21 10:06:11
0:03:24     2024-05-22 06:36:04
0:09:25     2024-05-22 08:03:05
0:40:22     2024-05-22 08:14:12
0:02:49     2024-05-22 09:47:03
1:48:29     2024-05-22 09:50:24
1:57:28     2024-05-22 12:09:49
WEBVTT

[The opening portion of this recording is garbled in the auto-generated transcript — audio checks and a discussion that could not be reliably reconstructed.]

I was, you know, I like the starting four or starting five, but you could also do, I feel like... but no, I agree with the assessment. You know, you're not going to get seven, right?
You know, we just know that's not going to happen. DoD is used to seeing five, six, you know, even four. So, great assessment. But yeah, I think we've laid it out pretty well. And then also going to deploy, you know, going from seven plus. So, no, I'm good with that.
Okay. I saw some of the comments as well, so I pulled open some of the Go team review docs and stuff.
Yeah, I still don't think it's working for me.
Tom, one second, let me pull your screen up here. So, can you just clear your queues?
Wait a minute, it looked like it did. Oh, I see a failure.
Go ahead and just clear all your queues.
Yeah, it does. It splits, but it also fails. I don't know why. I don't understand why it's doing both when I run through this.
What do you mean?
There's three files. All three fail, but then it also splits all three. I don't know. Makes no sense to me.
Can we look at that in your queue before you clear it? Well, we know our CSV is failing. Does it actually convert this to JSON, though?
It does.
No, no, no, it does convert to JSON. Can we look at one of those CSVs and see if they're JSON documents now?
They are not. Probably right here. This one?
No, the failure. Well, actually, are those failures or the originals? Because it may be the originals, and the split worked. Can you look at the CSV file real quickly? It may have actually worked, and what you're seeing is the original.
That's JSON now.
Okay. Because we routed failure and original to the same connection, you know, the same queue.
Oh, gotcha, gotcha. You're right, we did.
Yeah. So actually, I think it did work, and you should be good to go. Now you just need to do your evaluate JSON path.
Just the attributes on this?
Yep. No, no, no, you should have attributes there. Because you split, so you haven't done the evaluate JSON path yet. So you send it through that first, and then I bet you have attributes.
Okay. Great. Thank you.
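[For reference: the flow being debugged in this exchange is roughly GetFile, a CSV-to-JSON conversion, a split, and then EvaluateJsonPath. A minimal sketch with stock processors is below; the reader/writer service names, the directory, and the example JSON path are assumptions rather than anything shown on screen.]

    GetFile            Input Directory: /data   (one instance picks up the CSVs, a second the JSONs)
    ConvertRecord      Record Reader: CSVReader
                       Record Writer: JsonRecordSetWriter
    SplitJson          JsonPath Expression: $.*   (one flow file per record, assuming a top-level array)
    EvaluateJsonPath   Destination: flowfile-attribute
                       station_id = $.station_id   (hypothetical; add one dynamic property per field you want as an attribute)

[Routing ConvertRecord's failure and original relationships to the same connection is why the successful conversion looked like a failure in the queue above.]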
Hey, Joshua, I got a question on mine.
Go ahead.
So I think it worked successfully when I ran it through. When I clicked on the get JSON and get CSV processors and had them run once, I think it ran through successfully. But when I just have them start and keep running automatically, it's like thousands of files really quickly, and I don't really know where it's getting those files, because they're all referencing this data directory, and I only put three files in there.
Yeah. What it's doing is you're keeping the source, and it's just sending the same ones through. So it's just running the same files through over and over and over. Let's stop the get file processor and take a look at it real quick.
Okay. Yeah, I was clearing out the queues now because they got backed up pretty bad.
Oh, yeah. But you're testing the limits of the system. There are, like, shortcuts for this. You can actually go out of this group. So go back to your main canvas, using the breadcrumb, or just right-click and say leave group. Click on that group, right-click, and say stop. And empty all queues. That's a good, easy way to empty them all.
And then I bet if you look at your get file... well, you're not keeping the source file. What folder is that, the uploads data directory? Can you open that up? Actually, can you cancel? Because you have an error. What's at the top right? You've got a little red box. What's that say?
Unable to write flow file content. Content repository... due to archive file size constraints.
Yeah, I think that's us filling up our cache. I'm not too concerned about that error. Can you run once on your get file?
Yeah, I need to move the files back.
So, yeah. I mean, if you're having to move the files back and it's picking them up, you shouldn't have thousands going through, right? Is there anything in the processed or original files?
Yes.
Are you doing a recursive search on your get file? Look at your get file again. There are two get files; let me look at the other one as well. So, there's the one up here that picks up the CSVs; the other one is picking up the JSONs. Yeah, and your recursive is true. Are you putting files back into that directory?
Not in the same sub-directories.
Yeah. So, it's picking all your sub-directories up. It's drilling down and picking up everything.
Oh, okay.
Exactly. So, it's reprocessing everything.
Okay. That makes sense.
So, you might want to move it to a different directory, or just tell it no on that.
Okay. That's good. Yep. Thank you.
Not a problem.
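[For reference, the two GetFile behaviors discussed here map to these stock properties. The directory paths are made up for illustration; the point is that writing output underneath the input directory while Recurse Subdirectories is true makes the flow re-ingest its own results.]

    GetFile
        Input Directory:         /data        (hypothetical path)
        Keep Source File:        false        (true leaves the originals in place, so they get picked up again and again)
        Recurse Subdirectories:  false        (true also sweeps up anything written into sub-folders of /data)

    PutFile
        Directory:               /data_out    (write results somewhere outside the input tree)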
Sorry, my internet just keeps cutting out on me. But I'm here, John.
Okay. Is there a minimum engineering experience required? Because I'm thinking about data engineers, software engineers.
Yep. Exactly. Thank you.
Okay. Sure. Yep, that'll work. Perfect.
That's called out in the RFP.
Oh, I don't want to mess it up.
Okay. Yeah, you can hear me. Oh, I thought I had disconnected. There we go.
Okay. Perfect. Yeah, I like that. We call that out.
Looks really good. Yeah, I just want to bounce it around. I can do this. I just want to take a look at it and make sure the terminology fits with the previous, with everything — all terminology has to be the same.
Yeah. Okay. Thank you.
Oh, nice. So, um, merge content. Perfect. So if you want to, you can stop merge content, run your data through, and look at the queue, you know, just to make sure everything looks good. And then feel free to just write it out to a different directory, and you can then actually view the data.
Okay.
Yeah, this is right. And this is everyone's call, but you can either write it out to a CSV — I know we started with CSV; the point is, you know, learning the processors — but, like, you can do attribute to CSV, or you can create a JSON from the attributes. So you just save everything as, you know, one JSON document, or as individual documents. You would use that for, you know, loading into a database or something. If you wanted to save everything as a CSV, then have at it; you could use that to, you know, do Excel or something like that. So I'll leave it up to you. But yeah, you can stop it and look in the queue, or, after you do that, I would just write everything out as a CSV or a JSON document, make sure it looks good, and call it a day.
Okay, sounds good.
And definitely, let's see. Oh, and you've got it. Yours is looking pretty good, too. You got it cleaned up.
Okay. Yep. Yeah. Most of the processing is pretty straightforward over here on the left, and then everything's just kind of branching off to the error message.
Perfect. You might want to think about how you would handle better error handling, you know, breaking that up a little bit. If you can see my screen, I kind of ran into the same problem, and what I did, to clean this up, is I have error handling on the left that's taking care of half of the data flow, and then I have error handling on the right that takes care of the other half of the data flow. Just to make it look more presentable.
Okay, great question.
Okay. I was able to export some data. I didn't name them properly, but that's fine, I think I can fix that. It looks like they were only exporting the last record of data. It says hour 23.
Let's take a look. I was actually just doing this myself. I was on mine, building a flow as well, and I was actually just updating the file name when it writes, to save it as a date. But okay, let's look. So you have just that one record, right, is what you're saying?
Yes.
Did you run once, or did it run all the way through?
It ran all the way through.
I wonder if... can we look at your... can you close that? Can you look at your put file? Because that's the last one, right?
Yes.
All right, let's look at that. Go to properties, please. I bet it's because it's the same file name. It's overriding itself over and over and over and over again.
Okay.
Instead of saving each record separately, it's saving each new record over the last one.
Yeah. So can you see my screen? I was literally just working on that issue. If you're able to see my screen: after I put all of my attributes into a JSON document, I set the file name. What I do is I'm renaming the file name to a date, down to, you know, down to the milliseconds, and then that's how I format it. So you may want to do that. I'll be happy to put this in if you want to try to use it. I can't put it in chat because chat doesn't work. You could also just set the file name with, you know, dollar now — ${now()} — and it will just capture the current time as the file name. But I think it's because the file name is just overriding itself over and over and over again.
Okay.
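[A minimal sketch of the rename step described above, using the standard NiFi Expression Language now() and format() functions in an UpdateAttribute processor. The exact format string and the .json suffix are assumptions; the idea is just to give every flow file a unique name so the put file step stops overwriting the same file.]

    UpdateAttribute   (add a dynamic property)
        filename = ${now():format("yyyy-MM-dd-HH-mm-ss-SSS")}.json

[If PutFile's Conflict Resolution Strategy is set to replace, identical filenames silently overwrite one another, which is why only the last record (hour 23) survived; with unique names, every record gets its own file.]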
Well, actually, let's test this. We can test this. Let's run it, except for the putting-the-files-to-disk part. So just run it like crazy, except for writing the files to your directory. And let's let it queue up so we can see what is happening.
It's not really able to do that right now with the way it's set up. It reads the original files and then moves them, so it can only run through once.
So you can just copy them back in, right?
Yes.
Okay. It seems like the latency is still getting you.
A little bit, yeah.
So when you do your survey, don't give me bad marks for that. That's not under my control. All right. So you copied them in?
Yeah, let me try again. I think I forgot to turn off the write.
If you haven't noticed by now, NiFi can move data pretty quickly.
Yeah, I think it's transferring these files out before I even see that they're there.
It will. It will. I've seen folks build a flow, turn it on, and 10,000 files are gone. Okay, we've got a queue. So you've got 72 files ready to go. Let's list your queue. So you've got a lot of the same file name — same file name on that one. It should write more than one, but it will probably only write one or two. Scroll down a little bit. Yeah, it's only going to write three files out — three files, even though you've got multiple records for each file. So you probably want to do an update attribute and set a file name.
Okay.
If you want, you can use a UUID as well. You can have it generate a UUID; that way, when it writes it, it just writes it as a UUID. Yeah, let me... I'll send you the command. While you work on that, I'll send you the command to write it as a UUID if you want. Like I said, if you wanted to... let's see if Teams chat will work today. It won't work, because I'm not a member of the chat today. What I'll do, Peter, and I can do this for everybody, is I'll just throw a text document in your uploads and you can use it for your formatting.
Okay, sounds good.
All right, let me exit here.
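[The UUID option mentioned at the end would look something like this in the same UpdateAttribute processor; ${UUID()} is a standard Expression Language function that returns a random UUID, and ${uuid} reuses the flow file's own uuid attribute. The .json suffix is an assumption.]

    UpdateAttribute   (add a dynamic property)
        filename = ${UUID()}.json
        (or: filename = ${uuid}.json)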