All right, looks like... well, Darius, you've got yours, and Tom, it looks like you're working on it. It's zipped inside of a zip. Just keep following that folder down and you'll find the data flows (not the sample data flows), and you're looking for CSV to JSON in the import. So you'll have the exact same flow that I have.

Notice it was not a template. It's not an XML document; it was a JSON document. This is a quirk I flagged in an enhancement request I filed a while back: you can import a JSON data flow into a process group, but if you create a template, it's an XML document. And then we're going to get into Registry and version control, and that's a JSON document again. So I feel like there should be one common way to do this, and we should probably be retiring templates as they're currently implemented and going with the JSON documents that we can easily share as well. It's just one of those nuances within NiFi.

So you should have your data flow on your desktop. You're going to have to fix your GetFile, and you're going to have to fix your write-JSON step, of course. Everything else should work except the ConvertRecord. If you go to Configure on that, you should be able to go to your CSV reader controller service, enable it, and enable the other services, including the JSON ones, and then you should be able to run this data flow one time. It's only got one CSV file. See if you can get that to work, and then we'll move into the scenario, where you'd potentially use some of these same components in your own data flow.

We'll give that a few minutes. I'm going to look at everybody's screen. If you have any issues importing it, let me know, but it looks like everybody got it. I'm curious to see whether everyone is able to go to the ConvertRecord, follow the arrow, and enable the services. You can enable the service and its referencing components if you want; it may error out because you have multiple services that need to be enabled, but we'll see.

Darius, you're already in the controller services. Great, get those enabled. I've already configured them for you, so you don't really need to configure anything; you just need to enable the services. See if you can get that, and if you can, run that CSV document through one time and output a JSON document. Again, you're going to have to modify your GetFile and your PutFile, but that should go quickly. And this is the Windows file system, so you can make directories wherever you see fit. Also, if you go into the Avro schema controller service, you can expand it and look at the actual schema I used. You may want to model your schema off of that one for the scenario, just as a hint.

Right. Did anyone have an issue? I know some of you are still working through it, but did anyone actually get a CSV to flow through and output a JSON?

I did.

Oh, awesome. Did it help to see a little bit more of what a controller service does?

Yeah, definitely. I still feel very novice about it.
If I'm being honest, no. And again, the leap from creating a flow to using controller services is a big one; I totally get it. I have tried and tried to figure out an easier way to bridge that gap. So what I like to do is run through my flow, then have you build the exact same flow and enable the services, to get the look and feel of things. And the scenario is very closely related to this flow. But once you get through this, the rest is all easy, unless you want to write Python or write your own processor or something like that, where you've actually got to do some coding.

Quick question: how do I tell whether it's going through the flow or not?

Yeah. So do a Run Once instead of just turning it on; I like to just run one time. I'm looking at your screen, and... wow, you have 4,538 files in your queue. You see that?

I don't think I meant to do that.

Oh, that's okay. I would stop the GetFile, though, because you're about to blow it up. So right-click and say Stop. There you go. Refresh. Just on your main canvas, come out right here beside it, right-click, and go Refresh. All right, so you only loaded 8,000 files; not bad. At 10,000 it would have turned red; you noticed it turned yellow. This is a great teaching opportunity. Yesterday it was Richard; Richard was experimenting, and he had 10,000 files in the queue. The way NiFi is configured out of the box, once you queue up 10,000 files, it starts backing everything up until those 10,000 files are processed. But it's fine; we've got this.

So you see the next one is Set Schema Metadata. That's an UpdateAttribute processor; all we're doing is adding an attribute called schema.name with the value inventory. We're adding that attribute because the controller service needs to know it; see schema.name at the bottom, inventory. So say Apply, then right-click on Set Schema Metadata and say Run Once. Then refresh again, somewhere off the processor. And now you have that one file in the success queue after Set Schema Metadata. See that? Okay.

So let's go to Convert CSV to JSON and configure it first. You see it's coming in as CSV, right? And it's going out as JSON. So we're using the CSV reader. Go ahead and click the arrow, and it will highlight that record service; we are using the CSV record reader service. If you want, you can click the gear and look at the properties. Basically, the only change from the defaults is that we're using the schema name property. There are different ways to take advantage of the Schema Access Strategy, and this one is the simplest: we're just updating an attribute, setting the inventory name attribute, and that's what the schema name is. So we're telling the Schema Access Strategy to use the schema name property, and the schema name property is schema.name.
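To make that step concrete, here's a minimal sketch of what Set Schema Metadata is doing; the property name and value come straight from the flow, and the layout here is just illustrative:

```text
UpdateAttribute "Set Schema Metadata"
  added property:  schema.name = inventory
```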
And this is how we reference that property in NiFi: open curly brace, schema.name, close curly brace, so ${schema.name}. I'll say OK on that one. And while we're here, let's look at the other components so we can enable them. Look at your JSON RecordSetWriter. Notice its Schema Write Strategy is Set 'schema.name' Attribute. The CSV reader was using that schema, and we're going to use the same schema to write; schema.name just tells it which schema to reference. So say OK, and OK again.

And let's look at the Avro Schema Registry, the second one. Click the gear. Okay. The only thing this is, is a registry of our schemas. We set schema.name to inventory so that when the CSV reader reads that CSV file, it knows which schema to use, and when we write it as JSON, it knows which schema to use. Here is where you would put that schema in. If you click the value... there you go. And you see on the right you can drag and expand that box right above. Right there; drag it. It's a very, very simple schema: the type is record, the name is inventory, and it has three fields: store, item, and quantity. So when it writes that JSON document, it takes all of those records and puts the store, item, and quantity values into the JSON document, formatted properly. So say OK, and OK again.

Then let's look at the Avro reader. We have this because an Avro schema registry in NiFi is a very easy way to do this with controller services, but there are other options. Click Schema Access Strategy, click Use Embedded Avro Schema, and let's see what else we've got: we can use the schema.text property, we can use the schema.name property, we can use HWX schema reference attributes, or we can use a Confluent schema registry. If you're familiar with Kafka, Confluent has a schema registry. So if you were doing this with hundreds or thousands of flows, and you had your own schema department that maintained those schemas in a more corporate-wide fashion, you could point at a Confluent schema registry. We're just using the Avro schema, the bare minimum here, so click off of that and just say OK.

So you've got everything set and everything enabled. Hit X and say Run Once on the Convert CSV to JSON. Okay. And if you refresh, there should be one queued at the Set JSON File Name. Perfect.

Also, you noticed how when it came in it was 225 bytes, and when it came out it was 745 bytes, right? JSON documents are much bigger because they carry a lot more structure; CSV is just values, right?
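For reference, the inventory schema sitting in that registry is essentially the following; the field types weren't shown on screen, so the strings are an assumption (later in the session we keep everything as strings anyway):

```json
{
  "type": "record",
  "name": "inventory",
  "fields": [
    { "name": "store", "type": "string" },
    { "name": "item", "type": "string" },
    { "name": "quantity", "type": "string" }
  ]
}
```

And the size jump he points out makes sense when you see one made-up record in both shapes; JSON repeats every field name per record:

```text
CSV:   store,item,quantity
       12,widget,40

JSON:  [ { "store" : "12", "item" : "widget", "quantity" : "40" } ]
```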
Whether the separator is a comma or a tab or whatever, CSV is flat, while a JSON document has a lot more depth, so of course the file size is going to be bigger.

So the next one is Set JSON File Name. If you remember, the filename attribute going in is the name of that file, inventory.csv, and we want to update that file name to inventory.csv.json. So run that once; that rename is set in the UpdateAttribute you saw when you went to Configure. Then refresh again, and you're successful. So now you have a 745-byte file, which should be named inventory.csv.json, ready to be written to the file system.

So go ahead and run Write JSON File To Directory, and refresh. I don't see a failure, so let's look at the configuration for that Write JSON File To Directory. It put it in sample output data... cancel. I don't know where that is, so create a folder where you'd want to put this real quick, please, sir. Go to your Downloads on the left... there you go. In your Downloads, create a new folder called, say, data output, or whatever you want to name it. If you go to the far right, you can right-click and say New Folder... nope, go down below: you see that SDelete? Go right below that. Don't click on it, just go right below it, right-click, New Folder, and you can name it something like output. Now go into that folder. Perfect. Now click your address bar and it will resolve to a full path. Control-A, Control-C, and then configure Write JSON File To Directory: open that first property... there you go... and paste it in, Control-V. There you go. Say OK. Conflict Resolution is replace; perfect. You can apply.

Okay, so let's clear your queue. Right-click on Write JSON File To Directory and say Start, then start your Set JSON File Name. All right, and let's go to the Convert CSV. You noticed how I'm working backwards through the flow; the reason is that all the queues are clear except at the beginning. Now hit Refresh, and let's see how quickly you process 8,000 files. Yeah, see how quick that worked? You picked up 8,000 CSV files. If you right-click again, it'll probably be finished. Refresh. Yep, and you've got 4,000 files queued for delivery to the file system. Right-click and refresh: 1,000 left for writing. Oh, and you've still got one queued up above. So you processed 8,000 files in less than two seconds, I think it was, on an old Windows machine. Right-click again and refresh it.
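Side note on the Set JSON File Name step: in an UpdateAttribute, that rename is typically a single Expression Language property. A sketch, assuming the standard filename attribute:

```text
UpdateAttribute "Set JSON File Name"
  filename = ${filename}.json        (inventory.csv -> inventory.csv.json)
```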
Also, notice how I did not have him turn on the Log Message processor. The reason is that I want anything that goes to failure to queue up, so I can see why it failed. I don't necessarily want to log that error message just yet, because if I turn on Log Message, it's going to send that file to the log, write the error, and auto-terminate, and then that file is gone. I want it to queue up so I can see what's going on in case of a failure.

So right-click again, and that one single file hopefully is done. Also notice... oh, there it is. If you go to your top right, by the search, you see the little red box now; NiFi is cleaning up, and that takes a minute. The instance we're running on is using very conservative settings: we're not using a lot of RAM, we're not using a lot of CPU, and the content repository and those kinds of things are not configured for best performance right now. So it's going to take NiFi a minute to clear things out and write everything; you can see it's clearing itself. If you refresh again, it may... oh, can you stop your Get File From Directory? Because you're just processing it over and over and over again. Yeah, you've output almost 11,000 files in less than a minute; already over 11,000. Okay, let that clear, but that's your queue, and you should be good to go.

Since we're here, go out to your main NiFi canvas; go back up a level. Perfect. You can see now, just at a quick glimpse, that you have this process group. If you refresh, you can see there are 20 files in the queue; it read 11.27 MB and wrote 8.84 MB in the last five minutes; there are no more files in the queue; and you have two processors stopped and four running. We know the two stopped processors are the Log Message and the GetFile you stopped. So it looks like your queue is clear.

If you want, you can right-click on that process group and stop all processors: if you click Stop, it stops all of them, and if you click Start, it turns them all back on, and you'd be processing another 10,000 files every minute or so, just because it keeps picking that same file up over and over. Now right-click on that process group again; let's look at this for a minute. You can actually configure the process group. Go ahead; it takes you to the controller services being used. Go to the General tab; there you go. You can rename the process group, and you can add parameters to it. And there are defaults here: the default back pressure object threshold is 10,000 files. We saw that yesterday with Richard, when he had 10,000 files in his queue. If he wanted to increase that to 20,000, right here is where you can do it. There's also a size threshold; I think his was 1.1 GB, which was about 400 files. You can manipulate those values here.
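For what it's worth, that 10,000-object figure is also the instance-wide default applied to new connections; if I remember right, it lives in nifi.properties roughly as shown below, though you should verify the property names against your NiFi version:

```text
# nifi.properties (defaults applied to newly created connections)
nifi.queue.backpressure.count=10000
nifi.queue.backpressure.size=1 GB
```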
Just as a side note: if you are backing up 10,000 files, try to distribute that processing a bit, because you don't want a single processor doing that much work and hogging resources. But go ahead and exit out of that, hit the X, and then right-click again on that process group. And my apologies, I'm just using you as a training example.

So you can enter the group; you can start and stop processors; you can enable or disable that group; and you can enable or disable all the controller services associated with that group. You can also go down to Download Flow Definition and say With External Services, and it saves as a JSON document. Remember how you imported a JSON document into your process group to create this? Now you can export it back out and send it to a colleague, those types of things. That's what I was saying earlier: we have templates, but we also have this way of doing things as well.

Go back and right-click on your process group again. You can center it in the view, group it, or copy the group. You can empty all the queues. If your queues are full because something's just running and filling them up, and you want to diagnose it but just don't have time to look at it right yet, you can stop everything and clear the queues, then come back to that process group when you have time. You can delete it. You can create a template from the process group. You can view input ports, output port connections, and the status history; we looked at status a minute ago in the hamburger menu, and here you can look at it for this specific process group. So keep in mind that a process group is more than just a collection of flows; there's a lot of capability in a process group.

Okay. I think yours is good to go. Can you open your folder to see if you actually output any data? There you go: you ingested a CSV file and you wrote a JSON document. Perfect. Anyone else have any issues with this, or with creating the flow, or any questions on it?

Oh, no problem. And again, my style of training is probably a little different from most, in that I like to be very interactive. I like a lot of questions, I like conversation, I like just walking through things. So if you have a question, let me know. Yours looks good. Travis, I think you got yours working too. Good, good.

Yeah, last class we had a couple of folks who tried to set this up and missed one crucial step: in the CSV reader, where it asks whether to treat the first line as a header, true or false. A couple of folks in the class said false, and so it was throwing errors.
And so we had to go through and diagnose it. So just keep that in mind. But it looks like everyone got it running, and I think everyone got a JSON document out. Good deal.

All right, now we're going to move over to the actual hands-on scenario. Please use this flow as a reference: copy, paste, do what you need to do; you can steal from it. The best developers in the world steal their code from the internet, and that's what they work off of. So do what you must, but let me bring up the scenario. It's already there if you want to follow along, but I'm going to present it as well. There we go. You should now have a folder that says NiFi Scenario in your uploads.

So the scenario: you are a data analyst at a local government agency responsible for monitoring environmental conditions across various locations in the region. Your task is to aggregate, transform, and analyze weather data collected from multiple local weather stations to provide daily summaries and alerts.

Now, I know I have alerts in here. You may send this to a system that provides an alert and a dashboard and those types of things; you may send it as an email; whatever. So when it comes to the actual alert, when you're building your flow, I know you're not going to be able to send an alert to my cell phone, for instance. What I'd like to see is how you're thinking: how you plan to send this alert, how you got the alert, how it worked into your data flow, those types of things.

So we're using NiFi to automate the collection, transformation, and reporting of the weather data. The source weather data files are generated daily by local weather stations and stored in a directory on a local server. Each file contains hourly data, including temperature, humidity, wind, and precipitation. Here, I'll open one of them. You should have two CSV files and a JSON file. Here's the station ID, ST001, and the date. If you want, you can go in and update the date; that was from the last training class, the date they took this. Then hour zero to 23, temperature, humidity, wind speed, precipitation. They're just random values. So that is the data.

Your task is to set up a NiFi flow to monitor the directory and ingest new files as they appear using the GetFile processor. I provide suggestions for the processors to use: you can use ExtractText here, you can use ConvertRecord, whatever you feel most comfortable with. Again, NiFi can be used many ways to build a data flow, and the beauty of it is there's a lot of flexibility and a lot of freedom. That's also the downfall: you can skin a cat many, many ways in NiFi. So I'm providing just suggestions. That said, do go look at all the processors; I showed you how you can search processors, filter on the word cloud, those types of things.
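For reference, the weather files described above look roughly like this; the header names are inferred from the description, and the values are made up:

```text
station_id,date,hour,temperature,humidity,wind_speed,precipitation
ST001,2024-05-20,0,61.2,78,11.17,0.0
ST001,2024-05-20,1,60.8,80,9.4,0.1
```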
So what we want to do is start setting up our data flow to ingest this and transform it into JSON, which allows for an easier way to analyze. From there we can extract the text, or use ConvertRecord, whatever works. Then we want to do data enrichment: enrich the data with, say, a timestamp or precipitation data, those types of things. The enrichment step is somewhat optional. Then data aggregation: aggregate the hourly data into a daily summary for each parameter. So for station one, station two, station three, here's an aggregate of the hourly data for each weather parameter. Again, you can use MergeContent to combine all the records; there's your hint. And, I'm assuming you all know each other, but I think it was Ben in the last class who did a, sorry for my language, badass, really amazing job combining these records. I don't think he got it all complete, but I really liked the thought process he went into it with.

Then there's alert generation: generate an alert if certain conditions are met. So if you see a high temperature for that station, for that day or that hour, you might generate an alert. What I'm looking for here is the alert itself: "station 002 at 2300 hours had an extremely high temperature," some sort of anomaly, some sort of story. I know you're not going to be able to send me an email or a text or any of those things, but you should be able to get that alert out, and we can at least view it as an attribute, or view it in the queue, or whatever.

Some additional considerations: controller services, and the key processors to focus on. The goal here is to give you as many hints as possible. Again, there are many ways: you noticed in the last flow I was converting CSV to JSON, and if you look at your weather data, you have two CSV files and a JSON file to work with, right? Because not all sensors are the same. So the goal is to pick all these files up, get them into a common format we can work with, and then extract and compute.

That being said, any questions? Clear as mud. So how are we going to do this? I would go to your main NiFi canvas, create a new process group called Scenario One, and let's get started. I'm going to try not to talk as much; I'm here for Q&A, and I'm here for "hey, here's what I'm thinking, what do you think?" My goal, like I said: this is going to take us a few hours. After lunch, after we get through building this out, I'm going to go around the room to each person, and I'm just looking for that story. You're a data analyst; I'm looking for the story of "here's how I did this." So if there are no questions, let's get started.
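One hint on the alert piece: once the readings are available as flow file attributes, the alert condition can be a single Expression Language property on a RouteOnAttribute. A sketch; the attribute name and threshold here are hypothetical:

```text
RouteOnAttribute
  high_temp  =>  ${temperature:toNumber():gt(100)}
```

Flow files matching high_temp go down your alert branch; everything else continues as normal.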
As you see, I have everybody's screen pulled up, and I'll pop into your screen to check on you and provide commentary along the way. Also, I'm looking at beautification; I'm looking at all the items and all the little points we've touched on for the last day and a half now, so keep that in mind. Last class we had an amazing-looking flow. It wasn't fully operational, but it still, quote-unquote, passed, because it told a very good story of how it was done. It was laid out beautifully, with everything labeled, in colors. Someone actually broke theirs up into multiple process groups and used an input port and an output port; I know we haven't gone over those, but they're there, and the documentation's there. We had a couple of people use a funnel, which I on purpose didn't explain. I was really pleasantly surprised at people using tips and techniques that I didn't necessarily go over. Use the flow we just went over as an example, and you should be good to go. Just checking in to see if anybody has any questions; if you do, let me know. We'll work on this until we go to lunch, and when we get back, we'll finish it up.

Leroy, yours is looking really nice, and Darius, you're looking nice as well. It looks like Peter's just about got it figured out. Tom, I like how you've got all your processors out there, and now you're probably working to build those connections.

Yeah, but I have no clue, honestly.

Okay. Well, you wanted to put the processors out there based on your documentation and then try to work it out from there. Let's kind of walk through it. You've got to get your file from somewhere. There are two CSV files and a JSON file, I think, so we'd probably want to make all of those JSON. You could use that previous example, the data flow we just used: get the file, update the attribute. You may want to filter your files to only CSVs. So on your GetFile, go into the File Filter and add .csv on the end; I think .* is the default, and you want to pick up only CSVs. Does that make sense? Yep, and there you go. Then walk through the previous example where you used the controller service, and set that controller service up.

I don't remember how to do the controller services... I thought you could configure it within this processor.

Okay, I think we went over that quickly. Well, let's go back to the flow I sent over, from when I went through the controller service. Here we got the CSV file; we set the schema name, because we're going to need to build a schema; and then we used the ConvertRecord processor to convert it from CSV to JSON. So right there we set schema.name with the value inventory. So that way... yep.
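Back on the File Filter point from a minute ago: GetFile's File Filter property is a Java regular expression, so limiting pickup to CSVs would look something like this sketch:

```text
GetFile
  File Filter: .*\.csv
```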
So go ahead and cancel on that one. Then we used the ConvertRecord to ingest that CSV file; go to Configure. When you use that ConvertRecord, you're going to say you want to use a CSV reader. Hit that little arrow, and it takes you to the controller service. So follow along; the reason we went through that first flow is that it helps you immediately get things started. It's a little cheat. In your flow, you can leave the JSON document you already have alone, because it's already JSON. What we're going to try to do is: you may want to open that JSON document to make a schema that resembles it. And when you go into your Avro schema, you'll see the sample schema we put there when we did our flow; you can copy and paste it, add a couple of fields, and you should be good to go to have all of that as JSON. Once you get everything as JSON, you can extract the JSON elements, and then you're mostly done with the whole flow.

And where was that file that had all the curly...?

Yeah, that configuration is in your Avro Schema Registry. You clicked off of it; go back up, Configure, and then go to the controller services. There you go, click it. The second one is the Avro Schema Registry; click your gear, and there's your schema in the inventory property. There you go. Oh, that's all right. If you use this flow, you should be able to get to where all your values are JSON, and then we can do an EvaluateJsonPath processor and do some good math.

I know we're coming up on lunch here in a few minutes. The goal, and this is for everybody, is to get this started. Feel free to work on it through lunch; I'm going to grab a sandwich and come right back, but take lunch as you need. Just think about it: you're trying to aggregate these values, so you want everything in a common format. If you can get those CSVs converted to JSON, so everything coming out is JSON, then we can look at the EvaluateJson processor. If you go into the processor list and click on the JSON tag, you'll see the processors available for working with JSON. Work on that, Tom, and once you get to the JSON part, let me know and we'll walk through some more.

And again, for everybody else: the scenario just mentions suggested processors; you can use any of the 300-plus processors we have. But if you get hung up, or you're just stuck and can't go any further, please speak up and let's get you past whatever hurdle you're at.

Hey, if you get a second, I have a quick question on my workflow.

All right, I'm bringing yours up right now.
Okay, and I can walk through it too. So on this one, I'm getting at least one file in so I can look at it, right? I hoped to get the workflow going. I came in here, and I'm setting my schema name to weather, right, weather data. And I apologize for the latency here; I kind of have to click around a little bit. In here is where I'm getting my error, but maybe you can help me figure this one out. So what I did is I set my schema name to weather. As far as my reader... well, I'll start with the schema registry portion. I did configure my schema.

Okay, let's look at your schema real quick. Let me just get this out. Another little tip here: store everything as a string. It makes it a lot easier.

Okay. So, station ID: I tried to play with the types, and it didn't like that, and at this point I at least want to bring the data in, right? And I think it gets to a point where it reads it. And then in here, this one's pretty straightforward, unless I'm overlooking something.

No, that's right. The only thing that controller service does is enable the Avro reader; it allows you to have that Avro schema.

Yeah. And then let me look at my reader. I do have the schema name property... and I made sure of the first header...

Treat First Line as Header: true. Perfect, because if not, it's going to error.

Yep. Okay. And then, come in here: the same registry, the same schema, I guess, and then the schema name, which is weather, right? So I don't think I'm overlooking anything here, but what I was doing was stepping through it step by step, and when I get to here, it crashes.

Well, you've already got an error. Before you do that, let's look at your error. Top right.

Yeah, that's what I forgot how to get to. I was navigating in here and looking at the...

It's at the top right of your Convert CSV to JSON. You see the little red box? This one? Nope... on this Convert CSV to JSON; look at that processor, top right. Yep, there we go. "Failed to process"; there we go, that's the error I want to see. An incoming data error on input string "11.17", which is probably a value in the file, right? Let's go back to your schema itself.

Yeah, actually, I put it here.

Oh, beautiful. So I was looking at the data, and what I ended up doing... I think wind speed might be the problem. I was trying data types; I tried to use a double, I think, and it didn't like that for some reason.

It did not. The last class ran into this too, and my recommendation was: keep it all strings for now.

Okay, yeah, and I think that's what it is. I just noticed wind speed is not an integer.

No. So keep it all strings, and you should be good.
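Following that advice, here's a sketch of the weather schema with every field kept as a string; the field names are inferred from the CSV layout described earlier, so adjust them to match your actual headers:

```json
{
  "type": "record",
  "name": "weather",
  "fields": [
    { "name": "station_id", "type": "string" },
    { "name": "date", "type": "string" },
    { "name": "hour", "type": "string" },
    { "name": "temperature", "type": "string" },
    { "name": "humidity", "type": "string" },
    { "name": "wind_speed", "type": "string" },
    { "name": "precipitation", "type": "string" }
  ]
}
```

Note that the last field has no trailing comma after it; that detail comes up again later in the session.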
And I saw this exact thing; one of your fellow classmates in the last session did the exact same thing. So yeah, keep it a string and see if that works. You may have to mess around with it, but that should unblock you. And when you're done with that, you're going to have three JSON documents, right?

Okay, perfect. So let's make it a goal to have three JSON documents shortly after lunch. It's 11:36 your time. Again, this is a working lunch for me, so what I'm going to do is pause. I'll keep answering questions right now, but when we're done with questions, I'm going to grab the sandwich my wife made for me and eat it at my desk, so I can keep answering questions and things like that. I'm going to put up the slide that says we're going to lunch, back in 45 minutes, but feel free to work through it. Hopefully that fixes your problem, Richard. And I like the path you're on: you're using that previous flow, and you want to get everything together. So if you can, just say OK. And while I've got your screen pulled up, let me take a look at what you're doing past that. Go ahead and say OK and go back to your main canvas real quick. Sorry about that, I thought you were going to drive.

Oh, no, no, I try not to take over.

So you're going to set the Write JSON Directory... so basically it's the last flow, and you're just updating it?

Yes, so that I understood each of those processors, right?

No, that's perfect; we're reiterating what we already learned. And then I'll get creative again.

Well, good, good. Hopefully that squares you away; if not, just let me know. I don't always see hand-raising in Teams, because I'm actually watching you all, so for anybody else, feel free to just blurt out, "Hey, Josh, I have a question." But yeah, you should be good to go, Richard. How's everybody else looking? Tommy, a little less frustrated now, hopefully?

Yeah. I am going to step away for a little bit, but I'll come back to it in about 20 minutes.

Okay, all right. Peter, it looks like yours is good; how are you doing?

Is there a way to copy that entire process group to this new process group?

You've got to copy each processor at a time... no, actually, you can hold Shift and... you're coming in really slowly, but I heard that you're good and you're just working through it. Perfect.

Yeah, I just want to copy the process group.

Yeah. So you can hold Shift, select and copy the whole flow that we worked on previously, put it in a new process group, and then start adding to it. Also, remember when I had you create a new process group and you uploaded the JSON document I sent over? So you could start with a new process group, start from the previous flow, and start mocking it up.

That's what I needed to do. Okay, all right, there we go.
So take a break for a minute and clear your mind. I always feel like going to get something to drink or something helps me think about a problem while I'm not looking at a screen. All right, does anybody else have any quick questions before we break for lunch? I'm running around during lunch, just grabbing food and snacks and drinks, so I'll be around my desk; if anybody has any questions, feel free to let me know. It is 11:40 your time, so...

I also forget how you import...

No worries, no worries, I got you. So I think it'd be 12:25; 45 minutes. Okay, Thomas, here's how you do it. Did you bring down a new process group already?

Yeah, Scenario One.

Right-click, say Delete. Do it again, but this time don't hit OK: bring down a new process group... there you go, bring it down. And then to the right, you see the little upload icon? And then the CSV-to-JSON demo data flow, and Open. There you go; you've got a head start already.

I remember now. Okay, thank you.

You're very welcome. Well, I'm going to go grab my food. If anybody has any questions or anything... I had this issue last class: I can't type in chat sometimes, but I can see chat. So raise your hand, or just blurt out my name or something, and we'll get you squared away. I set the time, but like I said, I'll be back at my desk here in a few minutes if anybody needs anything.

[Lunch break.]

All right, so hopefully everyone is making their way back from lunch. I'm in here if you have any questions; I might miss a little bit of the afternoon session. Okay, Richard had to drop. But if we can, let's just continue working on your flows; I'll give us a little bit more time before we check in. We should at least have our CSVs all as JSON by now. If you used that ConvertRecord, you may have to modify your schema, like Tom and I were going through, as well as Richard, before the lunch break. You're more than welcome to share notes or things within chat; I don't mind at all. So if someone has a good schema they want to copy and paste into chat, have at it, so we can all share, if that's your approach.
But if you have any questions, just speak up, and I'll answer. I'm going to start going through and looking at how we're doing.

I do have a question on mine. I have it all set up here, but I have an error on the Convert CSV to JSON. It says the record reader and record writer are invalid because the controller service isn't enabled. And I did right-click over here and click Enable All Controller Services, and that didn't seem to do anything.

Okay. Let's go to your Convert CSV to JSON and look at the properties. Go ahead and say Configure, and let's go look at your CSV reader, so click the arrow. There you go. So they are invalid; when you hover over the exclamation mark, you can see you have a problem with your schema. How about the other one? They're probably both invalid because of the schema. Is it popping up? There we go. Okay, so let's look at that first one, the demo CSV schema registry. Let's go to its configuration. And let's look at... well, you're using inventory. You're still using the inventory name.

Oh, okay. Yeah, I changed the schema over here, so I have to change the name on the left side as well.

No... are you using the schema name inventory for this example?

No, I changed it to weather.

Okay, so there's no reference to that. So, if you want: copy your schema. Go into the box and hit Control-A and Control-C, then say OK. Say Disable, then Configure at the top right. Perfect. Go ahead and hit plus so we can add a new property. And what did you name the schema?

Weather.

Say weather, then say OK, and then paste your value: Control-V, or right-click and say Paste.

It's not working for me.

Hit Cancel. Okay, try copying it again. Yeah, there you go. Then go back to your value and click it.

It doesn't seem to let you. I keep having to double-click on a lot of stuff, so the copy and pasting might not work the first time either.

Okay, yeah; we're dealt the cards we're dealt with latency and stuff like this. Okay. So now I would delete inventory, because you're not using it for this flow; there's a trash can right there. Perfect. Say Apply. And let's see... invalid. It's still invalid because the schema is not a valid Avro schema. Can you paste your schema into the Teams chat, and I'll take a look at it?

I was having the same kind of problem. Is there anything wrong with leaving it named inventory, in the processor above, before that?

You can leave the name as inventory, but the schema itself has to change, because the data has changed. If you'd rather keep it clean, though, I would change the name and create a new schema, if that's the process you want to take. So let me look at this schema; give me just a second, Peter. Okay: type, name is weather, string, string, string... we're missing something here. Give me just a second. You have a comma after precipitation.
So go back to your model, your schema. Configure it and look: do you have a comma after precipitation? Precipitation should be your last field.

Yes. Yes, I do.

Yeah. Commas don't belong after the last field. There you go; say OK, then say Apply. There you go, and enable it; you can just say Enable. All right, close that. Enable your CSV reader and your JSON record set writer. We've cleared that error, so you should be good to go. But yeah, in an Avro schema, you don't put a comma on the last line there.

Okay, that makes sense. Thank you.

Yep, yep. All right, any other questions? Tom, you doing good?

Yeah, that's what I was trying to do, and yeah, that helped me; I did the same thing. I mean, inventory was working, but I'd rather do it right.

Yeah. I mean, again, it's there to copy off of, but also try to make it your own as much as possible. And the thing is, that only gets you to JSON; we still need to extract the JSON and so on.

Gotcha. Thank you.

No worries. So, as we work through this scenario: instead of writing that file to disk, you may then want to send it on to extract the JSON. The way I would handle this is, I would read that JSON back in, extract all the fields, and save them as attributes; then I can write that back out however I want. So there is a NiFi processor for this; let me get the exact name. It's EvaluateJsonPath. What that allows you to do is bring the JSON in and extract the values you need. If you use the EvaluateJsonPath processor, make sure the Destination is set to flowfile-attribute. If you do that, you should be able to extract every value from that JSON document and have it as an attribute. And again, the help documentation is there as well, and feel free to ask me as we go through and build this.

Hey, Josh, can you repeat that? I think you said use something to extract all...

Yeah. If it were me, I would send all the JSON documents, after everything's converted to JSON, to an EvaluateJsonPath. In the property configuration for that, you want to make sure your Destination is changed from flowfile-content to flowfile-attribute, because that way you can extract all the values; if you use content, it's only going to extract one. If you save them as attributes, you can quickly plug in and extract all the values in the JSON tree. You may need two EvaluateJsonPath processors, because you're bringing in CSVs and converting them to JSON, and you've also got your original JSON. And if you don't know how, just let me know. But in the EvaluateJsonPath... let me see if I can open this image. For instance, in the EvaluateJsonPath...
...it wouldn't be $.id here; it's going to be $. plus the field name for this data. Let me look... inventory. So if I were doing this one, I would do EvaluateJsonPath, and then I would start adding properties. The first property I'd add is station ID, and I'd give it the value $.stationId; that should extract the ST003. Then I'd add date, and on that one, $.date, because it's looking at the JSON tree. You can actually do more: if this JSON document had children embedded in the tree, in the hierarchy, you could drill down even further. But because everything's top-level, you just create a new property; there's a plus right here. Here, let me just cheat and show you.

I want to send all of my JSON to this processor. Well, I'd send the CSVs I converted to this one; if the JSON is different, I'd send it to another one. Instead of flowfile-content, I do flowfile-attribute. And I would do, let's see, station ID... and then $.stationId. That's the name of the field, because it's going to look in your JSON for that field. Station ID, date, hour, okay. So it would extract the station ID; then I'd do $.date, then hour, and so forth and so on. What that does is: the JSON coming in is read by this processor, and it extracts those fields out of the JSON and saves them as attributes. So now I no longer need to worry about the file content; I just need to worry about the attributes, and from there I can do a lot with attributes.

Let's see, what do I do next? Filter attribute... maybe a filter... let me see how I would do this. I might be able to do this with an UpdateAttribute; I might even be able to use a scan. So everything coming out of the EvaluateJsonPath would be attributes. After I've taken all the values and pushed them up as attributes, I have a list of attributes per JSON document, and now I just need to manipulate those attributes. Let me think on this; I'll keep giving hints, but let me think about how I'd do it the way we're doing it here.

Where were you when you were putting in those properties?

For the EvaluateJsonPath? Those? Yeah. Okay. So, if you're sticking to the previous example and not coming up with your own way: by following that example, I would skip Set JSON File Name, and I would not do Write File Directory; I'd actually take those out of here. I would go to EvaluateJsonPath, add that, and then in the EvaluateJsonPath, extract the values.
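Putting that demo in one place, the EvaluateJsonPath setup being described is roughly the following; each property name becomes an attribute name, and the JSONPath expressions assume flat, top-level fields named as sketched earlier:

```text
EvaluateJsonPath
  Destination: flowfile-attribute

  stationId    =>  $.stationId
  date         =>  $.date
  hour         =>  $.hour
  temperature  =>  $.temperature
```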
And then from there, and that's what I'm working on now as the next step: I would write a regular expression to calculate those, but I'm trying to think of another way besides a regular expression. I just think it would be a little easier... writing a regular expression would probably be the best way, and that's why I keep looking at the math and regular expression options.

Oh, gotcha. We usually use Regex101 when we have to do regular-expression-type coding work. But regular expressions still blow my mind, so it's still hard.

Oh, yeah, especially when they get complicated. It is.

So you have to add the properties manually? Because I don't see, like, example properties.

Yeah, you're going to have zero properties to start, right? So you're going to need to add them; I mean, this is an exercise: here's the name of the property, and here's the value. And with $.stationId, because all these values are top-level, it's going to go look at stationId and extract that. That's what it should do; let me double-check to make sure.

Okay, thank you.

Let me test out my theory as well. All right, let me run this once.

The only thing I was going to ask: is it normal for the Set Schema to run with only one in the queue? You get to the CSVs, you get two in the queue, and then you set the schema and there's only one. At least for me there is.

Well, you're picking up two CSVs, I think it was, right? Yeah. So you should get two JSON documents. Okay, interesting; I'll take a look at that too in just a second.

All right, I want to see if my theory here works. Okay, I've matched... well, this is my queue, and now my attributes... I should have... it's an empty string. Let me find out what I'm doing wrong. Oh, well, of course it's not going to extract, because I'm not picking those files up. Let me go through and fix my flow to get it to work. Oh, perfect. So I'm going to quickly walk through it; it sounds like most of you are building your flow.

Yeah, I'm a keep-it-simple guy, so the simpler the better; that's how I'd approach anything like this.

I get it. I one hundred percent would not try to get fancy right off the bat. I'd be like: what were the requirements? Okay, here you go: meet the minimum, and then go from there.

Well, I was actually expecting some of that, but it seems like everybody just took the previous example and worked from it, because you could actually have done this without any controller service at all, and it would have been a little bit easier.

Oh, yeah, but it would have been a lot more processors, though.

It would have been more processors, but less logic.

Okay. That's how I started building it out, and you saw all the processors I had on my canvas at the start, when you first looked at mine. I had so many because I was taking your document literally: okay, then do this, now extract this. Do you know what I mean?
--> Like each little piece I was creating a processor for.
--> No, it was purely an example of getting us started on the scenario,
--> but you could have gone any direction.
--> Okay. So I need to go...
--> Then I would have had a canvas full of processors.
--> It looks ugly, probably, too.
--> Yes, but then you wouldn't have had to deal with Avro schemas.
--> You wouldn't have had to deal with controller services,
--> you know, all those.
--> So, but either way, we'll get it going.
--> And like I said, this is the hardest lift.
--> Tomorrow will be a lot easier for the most part.
--> All right, disabled. That's right.
--> I am going to copy yours. Yours.
--> Okay. That's better.
--> All right. Looks good.
--> Okay. Let's see. Getting it.
--> Let's take a look at the new one there.
--> Okay. Let's see.
--> Hey, Josh. I'm trying to do this.
--> It's probably maybe not the right approach, but could you take a look at mine?
--> I've got one get-from-directory for getting the CSVs
--> and one for getting the JSON,
--> but the one for the JSON files, it's not working.
--> Okay, let's look at this.
--> So, yeah, let's go.
--> When I run it, it doesn't really work.
--> I don't know... where it says, go to configure.
--> Instead of JSON, put dot star.
--> Just change JSON to star.
--> Yeah, see if that grabs files first,
--> because we want to test our regex pattern first,
--> because that's what I was just having an issue with.
--> Go ahead, run once. And then refresh.
--> Four. So, yes.
--> So that's what I was just actually setting up.
--> And what I am going to do is... I was just literally doing the same thing.
--> So what I'm planning to do to solve this problem is,
--> I'm going to get everything.
--> I'm going to apply a model to it no matter what.
--> And then I am going to...
--> Well, I wasn't planning to build this flow with you all.
--> That's going to help you, but we will knock it out.
--> So I am going to get the file.
--> No, I'm actually not even going to do that.
--> I am going to get the file, and I am going to route on an attribute.
--> Let me check to make sure that this route...
--> Route on an attribute.
--> I was trying to do that too.
--> Like, send CSV to one branch and send JSON to the other.
--> You got it. You got it.
--> But it wasn't picking up JSON.
--> Oh, no worries. Well, we are going to pick everything up on this.
--> So I am going to change this name to get all files
--> from a directory. Okay.
--> And I want to do route on attribute.
--> And I am going to go to route,
--> Route to Property name. Say okay.
--> And then I am going to say CSV. I am going to say okay.
--> And then I am going to go with dollar curly brace,
--> file name ends with CSV, and then close that.
--> Okay. File name ends with CSV is going to go one way.
--> And then I am going to add another one that says JSON.
--> And I am going to say, if the file name ends in JSON.
--> Okay. I am going to apply.
--> And then I am going to take this, and I should have a CSV relationship.
--> All right. And then a JSON relationship.
--> I will get the JSON document. I will send it here.
--> I will send JSON there.
--> I don't need this success.
--> You guys are putting me on the spot.
--> I've got to build on demand.
--> And then, if it's not matched,
--> I am going to just log the message on unmatched.
--> And okay.
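The route-on-attribute expressions being spelled out loud here would look roughly like this in NiFi's expression language (a sketch; adjust the extension strings to match your actual filenames):

    RouteOnAttribute
      Routing Strategy: Route to Property name
      CSV   ->  ${filename:endsWith('.csv')}
      JSON  ->  ${filename:endsWith('.json')}

Each added property becomes its own relationship on the processor, plus the built-in unmatched relationship, which is what gets sent to the log here.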
--> What am I missing here?
--> My regular expression is not right somewhere.
--> Oh, it's contains. Contains is valid.
--> Apply. I mean... oh, no, no, no.
--> Okay. Perfect. Okay.
--> So let me empty my queue.
--> All right. That's getting weather data.
--> It's getting everything from weather data.
--> Run this once.
--> I should have a few files in my queue.
--> We have two CSVs and the JSON.
--> There we go. So I picked everything up.
--> It sent CSV where it needed to go.
--> It sent JSON where it needed to go.
--> And the last CSV it sent where it needed to go.
--> Okay. So, did you see how I accomplished it?
--> Yeah, I did. I saw that.
--> But it's still not picking up my files.
--> I have no idea why.
--> Let's look. Oh, wait. Let's change it.
--> Oh, yeah. Yeah.
--> Instead of doing a file filter there,
--> let's just route it.
--> You can actually do a route with the file name ends-with as well.
--> But yeah. I mean, this is one way to do it. Right.
--> So yeah. So you can route on attribute.
--> Okay. Good.
--> And then on your route on attribute, go ahead and configure it.
--> Can I send unmatched to log errors?
--> Yes, ma'am. That's perfect.
--> Okay. Oh, and I think you're... yeah.
--> Go ahead and run once, and then run once again,
--> because you've got three or four files.
--> So you want to make sure that they separated.
--> Okay. Run it again, because they're CSV.
--> Oh, you have to run it twice to get both the CSVs to show up.
--> You do. Because it just queues one at a time.
--> Okay. I see what you're saying.
--> Yeah. Because, you know, you ran the get files and it picked everything up,
--> but then you've got four files.
--> Okay. Looks like it worked for you.
--> You got JSON one way, CSV another. Perfect.
--> Thank you. You're welcome.
--> Okay. I might as well finish building out my flow.
--> Oh, my God. CSVs. Weather. Okay.
--> So if you're watching what I'm doing, I've got my flow set up where it's picking everything up.
--> It's routing JSON one way, CSV another way.
--> The CSV is getting routed to my convert record from the previous flow, and then going back to my evaluate JSON path.
--> The problem now that I'm running into is, it extracted the values as an attribute, but it's only extracting one value from the JSON document.
--> So I actually now need to split the JSON records.
--> And that way I get all the attributes for each hour of the day.
--> So if you're following along, that's where I'm at.
--> And I'm about to split these records.
--> So what I like to do is...
--> I'm going to split the JSON.
--> This is going to be a spider web.
--> Go away. Okay.
--> So from one JSON document, I got 24 records, which is 24 hours of the day.
--> And if I look, it should just be one record.
--> So from there, I'm going to evaluate the JSON path.
--> Let me run this.
--> And now everything should be an attribute.
--> Yep. So I was able to pick all the files up.
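At this point the flow being demonstrated looks roughly like this (a sketch reconstructed from the walkthrough; processor names are approximate):

    GetFile (get all files from directory)
      -> RouteOnAttribute
           CSV       -> UpdateAttribute (set schema) -> ConvertRecord (CSV to JSON)
           JSON      -> (passes straight through)
           unmatched -> LogMessage
      -> SplitJson (one record per flow file)
      -> EvaluateJsonPath (fields become attributes)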
--> If they were not JSON, I made them JSON, took all the JSON, extracted each individual record, and then pushed the single record to the evaluate JSON path.
--> I extracted all the values.
--> And now I have all the values as attributes.
--> From there, then I need to put in some logic to, you know, decide what is an alert, and then send the alert.
--> So I'm almost done.
--> Hopefully you all were able to follow along and/or do it your own way,
--> and go from there.
--> But we'll give it a few more minutes.
--> Then we'll take a quick break, our last final break of the day.
--> We'll come back and try to finish up what we can.
--> What we may do tomorrow is just kind of walk through, everybody walk through their flow and what they were thinking in the morning,
--> and then give you extra time after class to touch up anything you want to touch up.
--> But if you have any questions, you know, just speak up. I'm here.
--> Now I'm going to figure out how I'm going to design the rest of this flow.
--> And I'm already 11 processors into this.
--> Okay.
--> Anyone stuck, have any questions?
--> Where I'm at on my flow is, I am working on calculating the values, utilizing processors that we have here.
--> But, you know, if anyone else is stuck or having questions, please let me know.
--> If not, we'll take our final break of the day.
--> We'll just take a few minutes and take a quick bio break and grab something to drink, and then try to finish up as much as we can today,
--> so we can start off tomorrow going through these flows and finish up with some registry and other topics.
--> So does anybody have any questions right now?
--> I did. I was pretty lost, and I was just wondering where I should start.
--> I don't think I can get the file filter right.
--> Okay. Let's take a look at it.
--> Again, if anybody gets stopped, hits a hurdle, blocked, anything, you know, please speak up immediately, because I want to get you over that hurdle so we can get these complete.
--> But let's look at yours.
--> Okay. So, you know, on this one, you can do a file filter.
--> Did you look at the regex expression?
--> Or the unknown?
--> For, you know, the file extension, right?
--> So if you want to do... go ahead and go back to your other one, because I figured this out earlier.
--> No, go back to where you just were.
--> All right. And so you want to do CSV.
--> So modify that.
--> Take out CSV, but leave the rest. Take out CSV.
--> And do dot asterisk backslash dot CSV, and say okay.
--> Say apply. See if that works. Run once.
--> I've got to put in the right folder.
--> I don't think you need that backslash before dot CSV, do you?
--> Oh, I see. Take that... yeah. Take that backslash out.
--> You may not need that. Take it out.
--> I'm just matching the other.
--> Go ahead and say okay. And see if that works.
--> Run that once.
--> I think you may have taken out one too many.
--> Well, I think it might be star dot CSV after the bracket, I think.
--> Yeah. Go back one. There you go. Refresh.
--> Yeah. Hang on. Let me look at mine.
--> Well, let me work on my local version here and fix it.
--> What would happen...
--> Can you go back to your regex there?
--> Let's open that and do an asterisk.
--> Yeah. Start fresh.
--> So do an asterisk, front slash, the other one.
--> Backslash, I mean. Dot CSV.
--> All right. See if that will...
--> Trying to remember how the file filter applies.
--> Go ahead and try running that once.
--> No, it's already telling you. It's already telling you it's wrong.
--> What I did and what worked for me...
--> Oh, you rock. Oh, there you go.
--> Well, that's what I had a minute ago, was the slash dot.
--> He had an asterisk instead of the up caret.
--> Oh, I see. I see. I didn't see that part.
--> Okay. That worked. Awesome.
--> And then when I was going over it, here's how I approached it.
--> So come on.
--> So when they asked the question, what I did is, I was getting all files and sending everything.
--> And then I did a route on attribute.
--> And I said, you know, the file name contains CSV, or the file name contains JSON.
--> And then if it's JSON, it goes here. If it's CSV, it goes here.
--> But you should be good to go.
--> Yeah, I like the way you're doing it now.
--> I prefer to do it that way.
--> But I got lost in the sauce too when I tried to do it on my own.
--> I've kind of just... I don't want to say given up.
--> Well, let's not give up. Let's see where you're at.
--> I mean, I just think, you know, some of these flows require extensive knowledge and/or experience with the platform.
--> If you're not there, you're just going to struggle, honestly.
--> Yeah. Well, this scenario, you're going to struggle.
--> And that's why I keep asking, like, please, if you get stuck, ask me a question, because I want to help you through.
--> Because once you get through this, creating other flows will be a lot easier.
--> So you are routing on attribute, right?
--> Yeah, I like that approach after you were talking through that.
--> Yeah, it's a better approach, because that way you pick everything up to begin with, right?
--> And you don't have to have another get-file just to pick up CSV, another one for JSON.
--> You just pick everything up at once and start sorting and filtering.
--> OK, so you've got the route.
--> I actually started out... that's how I started.
--> And then I backtracked and was trying to keep it simple, using what we had already done.
--> And then you maybe changed my mind again when you actually were working with that.
--> No worries, no worries.
--> OK, so you got route on attribute.
--> So are your CSVs and JSONs going where they need to go?
--> And then you clear the queue if you want.
--> And then your CSVs are basically going to follow the same example as you uploaded in the previous, right?
--> So we're going to go here and then down to here.
--> Yep. We're still converting everything to JSON, right?
--> We're still converting CSV files to JSON files, right?
--> Correct. So format, whatever.
--> Yeah, so you can get to a common format, right?
--> And so you are routing on attribute, the JSON.
--> You're sending it to the JSON.
--> You're sending CSV... you're going to update the attribute.
--> You're going to assign it a schema.
--> Did you use Peter's schema?
--> Yeah, well, yeah, for the most part, yeah.
--> I'll look at it real quick. Yeah.
--> Well, you'll have to go into the convert record.
--> Oh, yeah, this one. Yeah, sorry. No worries.
--> And so let's look at your CSV reader and writer.
--> It's enabled. But go ahead and go to that one there.
--> And you have the weather... is that the same as Peter's?
--> Basically. Okay. Say okay.
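For reference, the file filter they eventually landed on is the stock GetFile pattern with the extension anchored at the end, along these lines (the default File Filter in GetFile is [^\.].*, which is the "up caret" bracket expression being discussed):

    File Filter: [^\.].*\.csv

    or simply:   .*\.csv

Regex101 (regex101.com), mentioned earlier, is a handy place to test a pattern like this before pasting it into the processor.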
--> It's a little bit off center, but it actually still works.
--> All right. And you did not put the final...
--> You're fine. I can see it.
--> Okay. Say okay. Okay.
--> And then exit out of that.
--> Looks like all of that's enabled and running.
--> So where are you putting your success once you...
--> Because I had something here and I got rid of this.
--> I need to still link this to something.
--> Yeah. So you want to probably...
--> You can do a split before you evaluate your JSON path, if you want.
--> So that's what I did is,
--> I went to the split JSON,
--> and I just split on every record.
--> So it's dollar dot star.
--> So that splits on every record.
--> And so, yep. And then split JSON, configure it.
--> And then the JSON expression was dollar dot star.
--> Okay. That's... that's okay. Asterisk.
--> There you go. Okay. Say apply.
--> All right. And then you've got your splits.
--> You hover over your split JSON.
--> What are we missing?
--> We have success. We have split. Failure.
--> I've got to send it to the log.
--> There's logs right there in the middle.
--> Send it to it right quick. There you go.
--> And then you can actually just right click on log error message,
--> copy, and then paste right beside it.
--> And that's where you want to send your original, for now.
--> So right click, say paste right there.
--> There you go. And then drag and drop.
--> So you get your original. There you go.
--> Walking me through it makes it seem like it's a lot easier.
--> Well, I get it, though.
--> And again, right, that's why I keep warning: this one's a hard one.
--> But we've got to learn, you know, controller services.
--> We've got to learn some of the concepts.
--> And this brings in multiple concepts.
--> It brings in controller services.
--> It brings in regex expressions.
--> It brings in route on attribute and attribute manipulation.
--> This scenario is set up to test all the parts.
--> But again, you know, some of these parts you're just being exposed to.
--> And so that's where you...
--> Hey, Josh, just a second. Where do I do this?
--> And that's what we're doing now.
--> So say apply.
--> But I also think knowing what to put in for some of these values...
--> It's also, you know, I mean, it might be in the documentation.
--> The documentation is kind of... not overly...
--> You know, it doesn't really go deep dive.
--> It's kind of just like an overview.
--> OK, here are the properties.
--> Yeah. You know what I mean?
--> So you've got to know what to put in a lot of these values.
--> And if you don't know, you don't know.
--> It's like, then you have to Google or chatbot or whatever.
--> Yeah. And that's why I usually just have, like...
--> I think I closed the window, but I had the expression language guide.
--> I always keep that up as well.
--> And that way you can look up, like, file name or something else like that.
--> It'll let me know.
--> So anyways, you are good there.
--> Let's look at your others.
--> OK, so now you're evaluating JSON path.
--> So let's configure it. That looks correct.
--> Now, the destination: you want flow file attribute,
--> because if you do flow file content, it will only extract date.
--> Oh, OK. Yep.
--> And that's what I mentioned earlier. That's a common one.
--> That's what I missed. We added these.
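The split-then-log wiring being described here, as a sketch:

    SplitJson
      JsonPath Expression: $.*
      splits   -> EvaluateJsonPath
      original -> log error message (the copied logger, per the walkthrough)
      failure  -> log error message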
--> So apply. Now you need to finish.
--> You know, if it's unmatched, I'd send it over there to that original.
--> If it's an error, I'd send it over to your left, and, you know, go from there,
--> and get rid of that write JSON file to directory,
--> and get rid of that failure point.
--> You know, those types of things.
--> And that's where I'm at right now, is I'm thinking of how I want to do this easily.
--> Now I can use regex and pull all these attributes in,
--> and I can actually do a divide, a math function.
--> But the math function is actually for the more advanced NiFi users,
--> so I don't want to do that.
--> So I'm trying to think of an easy way to do it.
--> Where are you sending original to for the evaluate?
--> You want to go to the log.
--> OK, so after the split JSON, I'm sending the splits to evaluate,
--> and I'm sending the original to a log error message.
--> OK, gotcha.
--> And then the evaluate JSON path is going to log, and then, OK.
--> Yeah, so I'm working... so that's a placeholder,
--> so I can test to make sure that what I'm expecting comes out.
--> And now, as you know, the beauty of this is, like, if I want to do
--> some other thing, right, I do something else for match.
--> Like, what would make more sense for match instead of the rolling window?
--> That's what I'm trying to think right now, is how do I want to do this,
--> utilizing the processors I have, that doesn't require regex or me writing Python.
--> Because, you know, I realize that we have a mix of technical talent on the call.
--> And there's nothing against anybody, right?
--> There's technical folks and there's managers. There's different types.
--> And so what I'm trying to do is, what is the simplest version I can make?
--> And so that's where I'm at right now.
--> So after I get through helping you all, I'm going to think through how I can generate an alert based upon this data, and what alert would I generate, and how would that look?
--> OK, I mean, a lot of the work I do is keep-the-lights-on type of work, you know what I mean?
--> So... but we're trying to get more agile and DevOps.
--> You know, I know, I get it.
--> But here's the beauty, right?
--> This is designed to, you know, start you off nice and easy and then make it really difficult.
--> And I think I've accomplished that.
--> But again, I'm very, very open.
--> So, you know, stop me, say, hey, I am stuck here.
--> If you are working on a processor and it's taking you longer than, you know, a few minutes, you may just want to say, hey, Josh, right, I'm trying to do this. This is what I'm thinking.
--> Because, again, what I'm looking for is not a complete flow.
--> What I am looking for is that story.
--> Here's what I'm thinking about this data flow. Here's what I'm going to do. Right?
--> And you're building a full-fledged data flow in less than a day.
--> So, you know, I'm not expecting a complete flow.
--> But what I do want to hear is, like: here's my story, here's what I plan to do, and here's how far I got, because that lets me know whether you're going down the right path or not.
--> And, you know, going from there.
--> And I like seeing how some folks are filtering right when they pick the file up.
--> Some are filtering as soon as they've got the file.
--> You know, there's many ways to... again, there's two different ways to pick files up.
--> And so, you know, I like to just see how some people, you know, think about the problem.
--> And that's the biggest thing with workflow-based programming, right, is just how you're going to think through that.
--> But I think, okay, so you should be good to go.
--> You should have attributes now.
--> So if you look right here, I have, for instance, all of my attributes.
--> So everything is stored in memory.
--> So I no longer need the CSV files or the JSON files.
--> Here's hour one, humidity 88.
--> We'll pick another random one.
--> Here's hour four, humidity 71, right?
--> I no longer need all of these files.
--> I could write every one of these back as a CSV real easily,
--> and then, you know, do some quick Excel math and call it a day, right?
--> So, yeah, okay. So I think... are you in a better state now?
--> Yeah, I think so.
--> I mean, I'm still going to struggle the rest of the way, I imagine.
--> But that's... yeah, no.
--> No. And please stop me. I mean...
--> No, it's okay.
--> I think it helps just talking it through and listening to you explain it better, or more, whatever.
--> Yeah. And again, if I need to explain more, I do not mind at all.
--> I have six kids, right? I'm used to explaining.
--> All right. Cool. Thanks, Josh. Yep.
--> Any other questions?
--> Hey, Josh. I'm having trouble with this split JSON as well.
--> I was trying to use the merge content, and I tried the merge record, but those weren't working out for me.
--> I saw that you were showing someone else to use the split JSON, so I tried to copy that, but then I wasn't sure what this error about the JSON path expression not being valid means.
--> Let's look at the property.
--> So you don't have anything.
--> So you see that JSON path expression is bolded?
--> Yes. So it's a required field.
--> But all you need to put in is dollar dot star, because it's going to split on every record.
--> So dollar dot star, just like I have mine.
--> Dollar dot star. And so that's going to split every record.
--> So each record is going to have temperature, humidity, wind speed, precipitation, hour, date, and station ID, right?
--> So it's going to split those records.
--> So what's it complaining about right now?
--> Okay. Now it's just complaining that it's not connected to anything.
--> Oh, there you go. Well, start... wire it up.
--> So that's taking the... it's reading the JSON file over here, and that's feeding it over.
--> The CSV files are up here going down, converting, and making their way down over here.
--> So those are all working.
--> So this split JSON can take that from both of those directions, the JSON files directly, and the converted CSV files?
--> Correct. Okay.
--> And then this next... put this off over to the side for now.
--> This evaluate JSON path is the next one.
--> Correct. And all these are top-level records.
--> So it's dollar dot the ID, right?
--> So dollar dot station ID, dollar dot date, dollar dot hour.
--> And that's going to return hour. That's going to return temperature, you know, those types of things.
--> Okay. Yeah, let me show you mine. See.
--> So all I did is I went in... yeah, dollar dot date.
--> Because all that's doing is telling this processor to look for dollar dot date, which is a top-level record.
--> If the date was embedded under station ID, you would have dollar dot station ID dot date.
--> And then it would pull that, you know, depending on the JSON tree.
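To illustrate the nesting point (hypothetical keys, since the class data is flat):

    { "station": { "id": "ST003", "date": "2024-05-06" } }

    $.station.id    ->  ST003
    $.station.date  ->  2024-05-06

With flat data, $.date is all you need.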
--> But everything is top-level. This is a simple JSON.
--> So you don't have to worry about arrays and embedded anything, you know, those types of things.
--> Okay.
--> And so once you have that, then you're going to have everything as an attribute.
--> So everything will be in memory. It's no longer in the flow file content.
--> So you can actually dump all your flow files, and, you know, everything is going to be in memory as an attribute, after the evaluate JSON path.
--> Where is your original relationship going, Josh, from that evaluate JSON path?
--> Is that going to the log?
--> Yes, it is going right here to the log error message, I think.
--> Okay. Hang on, let me check my lines.
--> I've got a spider web going here.
--> I don't think unmatched is going to the log.
--> Oh. Failure goes there.
--> Matched goes to where I need it to be.
--> Unmatched goes to log.
--> And so the original... it's actually split.
--> So I'm sending my split JSON to the log error message and then auto-terminating.
--> Because I don't care about the original anymore, right?
--> Like, I've already got the original.
--> I've already either converted it or extracted all the values.
--> In this scenario, I no longer care about the original document.
--> I now have those attributes that I can do whatever I want with now.
--> Okay.
--> So, you know, it's still a flow file going through, but it's attributes.
--> And once you have, like, attributes in memory, then manipulating those attributes is relatively easy.
--> And that's what I'm looking at for mine.
--> Like, we'll just use another random one here.
--> I have a date attribute, an hour attribute, humidity attribute, precipitation, the value, station ID.
--> So I could, you know, turn around and take all of this data, send it through, write it as a CSV file even,
--> and then, you know, export it as Excel, and then have some Excel templates,
--> so when I open it up, it automatically computes even.
--> So, you know, that's why... but I'm trying to think of a way to do this, you know, very simply through a processor, instead of having to write code or anything else.
--> Because normally I would just write some Python or something like that to handle it.
--> But we should all at least be to here very soon, to where you have data that's all in a common format that you can access.
--> We should be at that point, hopefully, by now.
--> All right, this is Peter's. You're up next, Peter.
--> I just said my session expired, so I had to re-log in. I just got back in right now.
--> Oh, okay. Yeah, so after that split, you want to evaluate JSON path, and, you know, you can send both failure and original... you know, make a shortcut, right?
--> The failure and original, send it to the error message; send your splits to the JSON, right?
--> You know, make it easy for yourself. All right.
--> I'll be right back. I mean, we're already over the break. We went through break.
--> So give me just a minute. I need to go grab something to drink.
--> All this talking, my throat starts getting rough. I'll be right back in five minutes.
--> Did your split JSON split each entry into a different file for you guys?
--> I think that's what it's supposed to do.
--> Because after going through split JSON, I had 55 queued, and that does not sound right.
--> That might be... multiple CSV files.
--> Or it might be each file is a single attribute. He had a ton, too, and he had like 24 of them.
--> It's each record. It's split into, like, multiple files. So that's what you're seeing.
--> Okay. Thank you.
--> It's stuck. I think I configured my services wrong.
--> Okay. Do I still need the service if I use the route on attribute?
--> Let's take a look. See what you got.
--> So you are getting the weather data. You're doing a route on attribute.
--> You're sending JSON one way, CSV another.
--> You're sending the CSV to the schema name, because you're going to convert it to JSON.
--> Success. And so you're going to convert that record. I see it.
--> What is the error you're getting with the convert record?
--> I think it's one of my services.
--> Yeah. Your services are not set. So you want to take all this in and set it up to make it JSON.
--> Yeah. Let's look at your services again.
--> Do I just click on either one?
--> Yeah. It don't matter. Either one.
--> And you have a fifth.
--> Okay. So go to the left and hover over the little yield sign.
--> Did you copy the schema that Peter put out in the chat?
--> I believe so.
--> Did you remove the comma that Peter had an issue with? And I think Tom as well.
--> Yeah. Let's go to the Avro schema registry.
--> Let's look at your weather schema.
--> Yep. Right there at the end. That one right there.
--> And you might want to take out those extra spaces.
--> So do a delete or backspace. Get rid of that comma.
--> Nope. Get rid of that. There you go.
--> You can expand that window so we can see more.
--> There you go.
--> And then scroll up and get rid of the extra lines on every line.
--> Right here? Yep. Perfect. Delete. There you go. Keep going down. There you go.
--> Is that okay? That'll do it for you.
--> And apply? No, it needs a comma after the others.
--> Enable that. All the way to the right is the lightning bolt to enable.
--> Say that's fine. Service only.
--> Close it. It looks good.
--> All right. So let's hover over our CSV reader.
--> Let's go to the gear on it.
--> Use schema name property. Schema.name.
--> Oh, on the schema registry, click that and say Avro schema registry.
--> Say okay. Say apply. All right. Enable it. Okay.
--> JSON record set writer. Let's do the same.
--> You probably, you know... yep. There you go.
--> Say okay. And apply. Enable. Say enable. There you go.
--> Close. Go back to your processor.
--> And what's the issue now?
--> No, no, no. We already looked at that one. Say cancel.
--> Go to the yield.
--> The relationship failure is not connected to anything.
--> So you need to connect it.
--> Where do you plan to send all failures, right?
--> I'd probably just send it to that log error message on your left.
--> And send your failure there. There you go.
--> Refresh. And success?
--> You're going with the extract text.
--> I wasn't really sure. I've got to read the top again.
--> Most everyone is going into a split JSON to bust that JSON file up into individual records.
--> And so you can, you know, even though it's in the queue, you should be able to drag and drop it over there.
--> Use the front of the arrow.
--> There you go. Take the little blue box.
--> Drag it over to split. There you go.
--> You'll have to do the rest of the terminations.
--> Like, you know, here's where you send the splits,
--> here's where, you know, from the split JSON...
--> There you go.
--> So send that where you... what are you going to do with it when you...
--> What's the thought here?
--> What does the split JSON do again?
--> So you're splitting the JSON records into single records.
--> So if you look at the queue, look at that one file.
--> Right-click and say cancel.
--> Right-click, list queue, and go to the little eye over there.
--> Actually, you can go over to the right eye.
--> Yeah, that one. Click that to view the content.
--> And pretty print's not on, but you can already tell that, you know, the header is also not set properly.
--> Instead of original, say formatted.
--> Click the view-as original. See if it says formatted.
--> There you go.
--> So it looks like, you know, the first record is just the station ID,
--> which tells me that in your schema, you did not set the treat-first-line-as-header property.
--> So we need to fix that.
--> But you now have what... one, two, three, four, five, six, seven, eight records showing right now.
--> If you scroll down, you're going to have probably 24 records.
--> So what the split JSON does is, it's going to split all of these records into individual files.
--> So you'll have just the station ID, the date, the hour, the temperature, humidity, wind speed, and precipitation as one file, not 24 records as one file.
--> Okay, so each one of these will be one file? Okay. Great.
--> And then, like I said, you want to go back, exit out of that.
--> That works now.
--> You want to go back to your controller service and make sure that you're treating the header as true,
--> telling it it does have a header, because it looks like you're ingesting the header as a record as well.
--> So if you go to CSV reader, you can go to the gear icon on that.
--> Scroll down. Treat first line as header. True.
--> You can disable and configure.
--> There you go. Perfect. Good job. Okay.
--> So that should take care of the header information being a record.
--> And then now you can split the JSON.
--> So if you were to clear your queue and run that again, it should show up properly.
--> The split JSON, you're going to need... go back into your split JSON.
--> Since we already have it open: you have no value set for JSON path expression.
--> It is required. So you would just want to do dollar dot asterisk.
--> There you go. Say okay. Apply. There you go.
--> And now you'll need to finish with its connections.
--> You don't have a relationship... sorry, you don't have a relationship on some of these others that you'll need to put in.
--> From there, if you're following exactly what I'm doing, you will send it to an evaluate JSON path processor.
--> And then from there, you can fill in.
--> So here, I'll pull mine up again. If you can, look at mine.
--> I send it to an evaluate JSON path processor.
--> Make sure you have flow file attribute.
--> And then I just extract every value out of that single record.
--> And so what happens is... and it's just dollar dot date, dollar dot hour.
--> There's no nested or embedded records.
--> And so if I look at my records, here is the actual record.
--> But because I did the evaluate JSON, I now have all of these as attribute values.
--> So the wind speed, temperature, station ID, humidity, the date.
--> So this is the 13th hour of 2024-05-06.
--> And then now that I have that, you can take attributes and save them all if you want and combine them all into one document.
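Pulling the controller-service fixes from this exchange together, the CSVReader settings end up roughly like this (property names as they appear in the CSVReader service; the schema is a sketch with assumed field names and types based on the columns mentioned in class):

    CSVReader
      Schema Access Strategy: Use 'Schema Name' Property
      Schema Registry: AvroSchemaRegistry
      Treat First Line as Header: true

    AvroSchemaRegistry -> weather schema, something like:
      { "type": "record", "name": "weather", "fields": [
          { "name": "station_id",    "type": "string" },
          { "name": "date",          "type": "string" },
          { "name": "hour",          "type": "int"    },
          { "name": "temperature",   "type": "double" },
          { "name": "humidity",      "type": "double" },
          { "name": "wind_speed",    "type": "double" },
          { "name": "precipitation", "type": "double" } ] }

Note the earlier debugging detail: a single trailing comma after the last field is enough to keep the Avro schema registry from enabling.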
--> I'm working on how I would do this using the processors that are available.
--> But go ahead, and if you can... I mean, you're probably going to get stuck on that point.
--> But if you can, just go and start getting your flow cleaned up, named properly, labeled properly, those types of things.
--> Make sure your relationships are there to do the evaluate JSON.
--> And let's at least get to where we have all the same data in all the same format,
--> because we're going to need to have the data in the same format,
--> we're going to need to have it all together,
--> before we can make any kind of calculation or look for any alert or anything else.
--> Perfect. Anybody else? Well, who else?
--> If you wouldn't mind taking a look at my queue.
--> I don't see the attributes like you do.
--> I think maybe I'm missing a processor along the way.
--> Okay, no worries.
--> I know we're running short on time, though.
--> No, no, you're fine. We usually go over on the second day.
--> I need to go grocery shopping and that's it, so I'm good.
--> I mean, after my split, I have both JSONs and CSV, so I think...
--> Well, that's fine, because it's just the file size.
--> Let's look at your attributes, so scroll down.
--> Okay, I don't see... hit okay.
--> That's what I was looking for. I don't see what you see.
--> Yeah, no worries. Hit X on that.
--> Let me look at your flow.
--> Okay, we are splitting the JSON. It's split. Did you do the...
--> Oh, the split is only going to give you a record.
--> So it's going to still give you a JSON document, but it's just one record per document.
--> So 54 sounds about right.
--> Then you're going to do the evaluate JSON path.
--> All right, let's look at your configuration on that.
--> I think the last time we looked at it... yeah, you have dollar dot date, okay.
--> So say cancel or apply, and run it once.
--> When I ran it, it went to failure, I believe.
--> Let's look at that failure real quick.
--> Go to that failure. Right click on that failure, and list the queue, just like you would any other queue.
--> Oh, gotcha. Oh, each one of these has its own queue.
--> Yeah, they're all separated by their own queue.
--> Yeah, go ahead and click that.
--> You are just pulling ST003. Huh.
--> Why in the hell would you be doing that?
--> Let's go ahead and hit X, and let's actually clear that, because it might be a header issue. We'll see in a minute.
--> Just go ahead and empty the queue.
--> Do another run once.
--> All right, what's the error on the box?
--> On the processor? On the right of the processor, you've got a red box now?
--> Oh, yeah. Flow file did not have valid JSON content.
--> Okay, so we're converting it to JSON.
--> Empty the split queue. Right above it. Yep, empty that queue.
--> Maybe I didn't run once enough times.
--> Let's look at your split JSON. What do you have? Configure.
--> Dollar dot star. Okay, apply.
--> And then your convert record is going to success.
--> Can you run once to get us to split JSON?
--> On this? Well, you'll need a file to provide that one, so you'll have to get CSV from directory.
--> Now, refresh that.
--> And it should run, so just say route on attribute, say yes, go ahead. Run once.
--> Actually, you're going to need to run it three times, because you have three files: two CSVs and a JSON.
--> So we know that that's working.
--> So run it one more time, and it should be the second CSV.
--> Perfect.
--> All right, so look at your split JSON, because you already have a JSON document.
--> You're ready to go over there. Okay.
--> Let's run that once.
--> Did I clear this? Yeah, go ahead and clear it.
--> You don't need to, but let's do it.
--> As a matter of fact, let's not run that split JSON once.
--> Just turn it on and start it. Refresh.
--> All right, and now stop it.
--> Let's look at your split, the split queue, the one below it.
--> There you go. You have seven files now.
--> All right, let's view that. Say okay.
--> Click the eyeball. I want to view the content.
--> Is it still... okay. All right, exit out of that.
--> Dollar dot star. Empty string.
--> And it's just getting the JSON.
--> Run that get CSV from directory again.
--> You'll have to run it once to get the CSV. Run once.
--> And then the route on attribute, you have to run it twice to get the JSON.
--> We know the first one's a CSV. The second one is...
--> Oh, you can't do... click on the processor.
--> Refresh first? No, don't click on the red app.
--> There you go. Okay. And then refresh.
--> We should have a JSON document now.
--> Let's look at that JSON document before it goes in.
--> All right. I want to double check to make sure. Hit your eyeball.
--> All right. Okay. So that is a... I see 24 JSON records.
--> Let's exit out of that. Exit out of this.
--> All right. The split JSON. Open it up and configure it.
--> Okay. JSON path. Dollar dot star.
--> Okay. Hit cancel.
--> Why is that... you see the...
--> Take your split JSON, that processor, can you kind of drag it up a little bit and align it to the right of route on attribute?
--> There you go. All right.
--> So there is where your... let's empty the queue.
--> Between route on attribute and split JSON, let's empty the queue of that JSON, that one document you have.
--> Yeah, go ahead and empty it. Perfect.
--> And let's say... I'll say okay.
--> And then right click and say delete on that connection.
--> That split JSON. That one, yes. Delete it. All right.
--> On your route on attribute, drag it over to split JSON.
--> And make sure you have just JSON selected, and say add. Okay.
--> So you're getting the files. You're routing on attribute.
--> What's that route? Can we go look at your route on attribute?
--> Hopefully it matches the one I have.
--> Okay. Say cancel.
--> Empty the queue from split JSON to evaluate JSON path.
--> You have a split and an original queue. Empty both of those.
--> Yep. Empty your original. Okay.
--> You get CSV... from get CSV to route, empty that queue as well.
--> This is really weird, because it does match mine, and yours is...
--> There's something missing.
--> So let's run get CSVs from directory.
--> You're just picking everything up, right? Okay.
--> Run that one. Refresh. Three files.
--> Perfect. That's what you're supposed to have.
--> All right. Run that three times, and we're going to get two CSVs and a JSON.
--> First one, CSV. Yep.
--> Run it again. The second one is JSON. There you go.
--> And run it one more time just to clear that queue.
--> There we go. And that should be the CSV.
--> Perfect. So we have a JSON document.
--> Let's look at that JSON document in the queue to double verify that it's actually in this queue.
--> All right. Go to your eyeball. Your eyeball.
--> I don't see any issue with this data.
--> All right. Exit out of that.
--> Let's hit configure before we run it once.
--> And that's a dollar dot star. Dollar, period.
--> All right. Let me look at mine. Double, triple sure.
--> Okay. It matches mine to a T. Say okay. Say apply.
--> And then run split JSON. Refresh.
--> No. I can already tell you it's not right.
--> I just wanted to go into the log, and then seven or...
--> Yeah. The splits should be going to the evaluate JSON path; they are.
--> But the splits should be much bigger than 29 bytes.
--> Okay. Our problem is coming out of that split JSON.
--> So let's empty the split queue. It's empty.
--> All right. Hang on one second here. Let me see.
--> Values going there. Success from convert CSV is going in.
--> Let's bring down another split JSON processor and put it right beside the split. There you go. Right there.
--> All right. And then let's take our connections, and, you know, the route on attribute where you have JSON, go ahead and move that over to your new processor.
--> All right. Perfect.
--> And let's take the convert-CSV-to-JSON line and move it over as well.
--> That one. There we go. Bring it over there. Perfect.
--> Empty your original queue.
--> Right beside that split JSON is that original connection.
--> There you go. Empty it. And then delete it.
--> And just delete. And let's delete that split.
--> No, not that one. The connection. The connection you have right there. Right beside your mouse.
--> Right where you had it. Correct. Yep. Delete it. Okay.
--> And okay. Let's delete the split JSON to your log error message. You got it. Delete it. Okay.
--> Go ahead and delete that split JSON that you have up top. Get rid of that.
--> Let's look at your configuration of this one.
--> All right. Let's set our value. What was it? Dollar. Dot. Star.
--> After... yeah. There we go. Okay. Apply.
--> All right. And let's drag down our first relationship.
--> And let's go to evaluate JSON path for the split.
--> And then let's bring down the original and failure to the log error.
--> You can do original and failure. Say add.
--> Yeah. And if you see your connection, it says now failure, original.
--> I like that. Okay. That's very cool. You don't have to do three different ones.
--> And I went through and deleted all my labels, just because I need to clean mine up as well.
--> All right. Let's run it. And let's see.
--> So get CSVs, we're going to run once.
--> As a matter of fact, just turn on route on attribute, leave it started.
--> And then run once on your get CSVs.
--> All right. And so now your count should be eight and one.
--> Seven and one. Sorry. Math is off today.
--> Okay. So before we start split JSON, let's take a look at that JSON document one more time,
--> because there is something funky going on here.
--> List the queue. Your eyeball. Okay.
--> That looks beautiful.
--> And it does say it's an application... it read the content type.
--> So a lot of these processors have a built-in MIME type.
--> Go ahead and exit. Yeah.
Now go ahead and close this.
--> All right. Run once. Actually, just turn it on. Just say start. Refresh.
--> Still not right.
--> This is weird, because it's exactly like mine. Exactly, exactly like mine.
--> It is odd.
--> All right. Tom, give me just a second.
--> I'm going to do this. I'm going to take over. Okay.
--> That's cool, because I've got to run to the bathroom real quick. I'll be right back.
--> So I'm having the same exact issue as Tom.
--> And I've been following you, and it's not correcting.
--> Oh, my Lord.
--> It's really weird, because mine runs.
--> Let me double check mine. I picked it up.
--> Let me see here. Refresh. All right.
--> Wait a minute. Wait a minute. Mine's erroring out now.
--> Even mine is now erroring out, and it was working just fine.
--> All right. Well, luckily, Tom, I don't think it's you. I think it's me.
--> Oh, yeah. Because I'm running mine as well, and now I'm getting the same results, where previously I was getting a split document.
--> So let me... I'm going to work on Tom's and get this figured out.
--> Ekta, you want to take a quick break? You know, have at it.
--> I'm going to take a quick break too, and I'll be right back.
--> Yeah. Yeah. Even mine, I'm getting it wrong again.
--> And mine was working just fine.
--> Yep. Weird. This is weird. Something...
--> All right. Okay.
--> That worked. And the splits are there.
--> It should be an individual record. Yep.
--> Okay. So that worked.
--> What did you do?
--> When the CSV is converted to JSON, and then the JSON is sent to the split JSON, it works just fine.
--> But when it's just the raw JSON document being picked up, it's not.
--> And so why didn't you get that?
--> So I'm going to take a look at this.
--> It's really weird.
--> All right. Let me look at... what is the difference in this data now?
--> No. It can't be a problem.
--> Okay. That is definitely different.
--> Oh... what about...
--> Good lord, the spider web of flows.
--> I am very forthcoming when it comes to this, and, you know, I try to explain, like, you know, this can quickly turn into a spider web that goes everywhere, and...
--> I can see where it would be easy to lose track of where you're at and what's going where.
--> Yeah, and like, I try to kind of clean up along the way. Let me delete this.
--> Mm-hmm.
--> And we, you know, we are putting in a lot of extra logging and stuff for now.
--> And we're doing best practices, right? Best practice is, you know, add these log messages wherever you can, and, you know, go from there.
--> But, you know, once we get ready to clean this up, then we would take care of some of these, clean it up.
--> But if I don't clean it up now a little bit, I am not going to be able to see.
--> Okay. So I have what's coming out of the CSV to JSON going to this. We know it works, and we know it works well.
--> Then I have this split JSON with another type of expression.
--> That's what I was going to say. I am not setting the JSON file.
I mean, that's the one thing I am not doing that you are doing.
--> Yeah, I don't even need that step, to be honest.
--> It was part of the old flow, and since we're all working off of the old flow, I decided to just leave it in there.
--> I get it. But if I were to sit down and do a... why are you not... did you run once?
--> Okay. Turn that off. Run once.
--> Should be two and one. One. All right. That ran, that ran, that ran.
--> Perfect. Two. Right. That's what I'm talking about.
--> Okay. What is going on here?
--> Okay. I think I found the problem. One second. I'm going to fix this.
--> So when we are taking our CSV and we are making a JSON document, we are doing it absolutely perfect.
--> And so the JSON that's coming out is formatted perfect. It's formatted beautifully.
--> This JSON, the weather one, I don't know what happened with that.
--> It's still valid JSON. It's just not formatted as a valid JSON document, because it's just a bunch of JSON objects, not a JSON array.
--> But give me just a second and I will fix that JSON.
--> Because I was like, wait a minute, what is going on? But yeah, I have a fix. I have a fix. We can do this.
--> Okay. So there is the split JSON.
--> So, okay, if you are doing a route on attribute and you want to quickly get around this hurdle, take your JSON
--> and just move it to a log message for now. You as well.
--> And that way it will continue doing your splits on the CSVs, and you can go from there.
--> Just a second, I'm going to provide you with an updated JSON document that is better formatted.
--> So give me just a minute. But, you know, Thomas, you can as well, if you want, you can just take this and we will drag it to here.
--> If the latency will let us, we will empty the queue.
--> All right. Okay. So your flow should work now, and you shouldn't have any issues.
--> You're not going to handle the original JSON document, but I'm about to replace it with a new document. And I'll just replace it on your desktop.
--> Okay. Let me see. Where... I'll see where. Server administration.
--> Yeah, I created a folder. Oh, nice, nice, nice.
--> Instead of working out of the downloads or uploads or whatever. Server administration is what we're used to in our environment. So I was like, let me just mimic that.
--> Yeah, yeah. Just mimic what you know, because, you know, whatever flows you set up for yourself, right, it's going to be very closely related.
--> And when coming up with the scenario, I, you know, tried to model what you all would run into, because I know I've been to YPG.
--> I am Army. I've worked for the Army. So I know a little bit about what y'all's mission is and what y'all do.
--> So I tried to model this scenario off of that, plus other information I learned about you all.
--> And that's why I came up with the data analyst scenario.
--> But the funny thing is, this actually worked just fine. This is the same data as the last class.
--> But the last class didn't have any kind of help starting off. They didn't have the previous example like we did.
--> So they had to come up with their own methods.
--> And so they handled it differently, and that's why they were able to handle the... I think they were just splitting it by row.
--> Actually, I wonder if you could do that.
--> But anyways, work on that. I will provide a new version of that data file.
--> And I think actually it was splitting.
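The formatting problem he found, in miniature: SplitJson's $.* expects one JSON document, so a file of bare objects fails where an array of the same records succeeds (the records shown are illustrative):

    Fails as a single document (bare objects, JSON Lines style):
      {"station_id": "ST003", "hour": 0, "humidity": 88}
      {"station_id": "ST003", "hour": 1, "humidity": 87}

    Splits cleanly with $.*:
      [
        {"station_id": "ST003", "hour": 0, "humidity": 88},
        {"station_id": "ST003", "hour": 1, "humidity": 87}
      ]

The converted CSVs worked because the JSON record set writer emits a proper array, which is why only the raw JSON file hit the error.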
--> We could actually... let me look at one thing real quick.
--> Oh, you're slow. Very slow.
--> Yes, I don't... I contract to the training company. I don't work for the training company.
--> So, like, it's not my choice on some things. And definitely not my choice here.
--> So I think I can do a split. Oh, I could do a line split.
--> I'm just thinking... I'm going to put a new JSON document on you all's desktops.
--> But if you wanted to work with that original, you could send it to a split text processor first to split by lines.
--> And then you would have a valid JSON document, because it's going to be a single record.
--> But let me... we're actually way over on time as well.
--> So let's pause today. Tomorrow we will very quickly pick back up.
--> We will finish this. Again, I'm not looking for a complete data flow.
--> What I'm looking for is, you know: here's my thought process, here's what this flow might look like.
--> You know, Tom, you kind of had it right earlier where, you know, you laid out all your processors.
--> So what I was looking for is more of linking those together, getting as far as you can, which you have.
--> And then, you know, what you can't get accomplished,
--> you know, we'll get to the processors and say, well, you know, I don't know how to do it,
--> but I would bring all the attributes in and do a math function or write a little script or something,
--> and then I'd send it out as email. And, you know, that would have completed this scenario.
--> So, you know, again, I know this is going to be a struggle on this scenario.
--> I promise you we're going to have one more quick and easier one, so that you will go away with a victory.
--> But the reason for this one is, it really dives into multiple different processors, different controllers, different situations.
--> It involves multiple different types of files, multiple files, you know, those things that exercise all of NiFi.
--> So that's the reason we do this scenario. But we'll get it knocked out.
--> So with that being said, what I'll do is, I'm going to go to everyone's browser and put a new JSON document in the folder that you pick up from.
--> So I think everyone's using the file. I'll go through and see where you're getting that file.
--> I'll replace the JSON with another JSON document.
--> And then you should be able to run your flow all the way to splitting successfully.
--> And then tomorrow morning, before break, we will finish this, not necessarily finish the data flow, but finish the thought. We'll go around the room.
--> What I'm listening for tomorrow is going to be, you know: here is my data flow, here's what I got accomplished,
--> here's what isn't accomplished, here's my thought process, here's what I was thinking that I could do,
--> here's, you know, I cleaned it up, the beautification, you know, things like that.
--> Because, you know, that tells me you've got the main concept of NiFi, as well as some of the main components.
--> You may not know all of the components, but at least you know where to go to get information.
--> You know where to go to get a processor, those types of things.
--> So that's what we will work on in the morning. I will handle the JSON.
--> Anybody else have any questions?
--> Okay, so that will conclude for today.
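For anyone who wants to try the split text workaround mentioned above before the replacement file shows up, a sketch (it only works because each line of the original file happens to be a complete JSON object):

    SplitText
      Line Split Count: 1
      splits -> EvaluateJsonPath   (each split is now a one-record JSON document)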
Feel free to continue working on this, and then I will come in and touch up your flow for just this component.
--> And then that way you can, you know, do whatever you need.
--> All right. That's great. Thanks, Josh. Yep. Yep.
--> All right. If that's it, anybody have any pressing questions? If not, I am going to go grocery shopping.
--> All right. Have fun. All right. Thanks, everyone.
--> And if you're playing in your machine, I'm going to take control in just a minute.
--> All right. Yep. Thanks, guys. Thanks.