keedio / flume-ftp-source

FTP network server is source of events for Apache-flume

Flume ftp source #31

Closed sangeesivakumar closed 6 years ago

sangeesivakumar commented 6 years ago

I am trying to connect Flume with an FTP source, following this project: https://github.com/keedio/flume-ftp-source.

I downloaded the FTP-source jar from that site, followed the instructions given there, and tried to connect to the FTP server.

The source is picking up every file in the home directory: `INFO source.Source: Actual dir: /home/xxx files: 31`.

I want it to read only one file on the Desktop, so I set the working.directory property, but it still fetches all records from the home directory.

In between, it also throws errors while pushing data to HDFS:

```
18/05/23 15:56:57 ERROR source.Source: ChannelException
org.apache.flume.ChannelException: Unable to put event on required channel: org.apache.flume.channel.MemoryChannel{name: MemChannel}
    at org.apache.flume.channel.ChannelProcessor.processEvent(ChannelProcessor.java:275)
    at org.keedio.flume.source.ftp.source.Source.processMessage(Source.java:404)
    at org.keedio.flume.source.ftp.source.Source.readStream(Source.java:376)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:248)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.process(Source.java:102)
    at org.apache.flume.source.PollableSourceRunner$PollingRunner.run(PollableSourceRunner.java:137)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flume.ChannelException: Cannot commit transaction. Heap space limit of 55297 reached. Please increase heap space allocated to the channel as the sinks may not be keeping up with the sources
    at org.apache.flume.channel.MemoryChannel$MemoryTransaction.doCommit(MemoryChannel.java:126)
    at org.apache.flume.channel.BasicTransactionSemantics.commit(BasicTransactionSemantics.java:151)
    at org.apache.flume.channel.ChannelProcessor.processEvent(ChannelProcessor.java:267)
    ... 11 more
18/05/23 15:56:57 INFO hdfs.BucketWriter: Creating xxx/topics/Specified_flume_ftp_data/FlumeData.1527071214656.tmp
18/05/23 15:56:57 INFO hdfs.BucketWriter: Closing xxx/topics/Specified_flume_ftp_data/FlumeData.1527071214656.tmp
18/05/23 15:56:57 INFO hdfs.BucketWriter: Renaming xxx/topics/Specified_flume_ftp_data/FlumeData.1527071214656.tmp to xxx/topics/Specified_flume_ftp_data/FlumeData.1527071214656
```
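As an aside on the ChannelException above: the limit of 55297 in the message is consistent with the configured byteCapacity of 6912212 bytes, since MemoryChannel's default 20% buffer percentage and 100-byte slot size give 6912212 × 0.8 / 100 ≈ 55297. A hedged sketch of a channel section with a larger byte limit (the value is illustrative, not tuned for this setup):

```properties
# Sketch only: same memory channel, but with byteCapacity raised so that
# multi-megabyte files from the FTP source do not overflow it.
FtpAgent.channels.MemChannel.type = memory
FtpAgent.channels.MemChannel.capacity = 100000
FtpAgent.channels.MemChannel.transactionCapacity = 1000
# byteCapacity defaults to 80% of the agent's JVM heap when unset;
# ~100 MB here is an illustrative value, not a recommendation.
FtpAgent.channels.MemChannel.byteCapacity = 100000000
```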

What property needs to be set to point at only a particular directory?

My configuration property file

```
# Naming the components on the current agent
FtpAgent.sources = ftp1
FtpAgent.channels = MemChannel
FtpAgent.sinks = HDFS

# Describing/Configuring the source
FtpAgent.sources.ftp1.type = org.keedio.flume.source.ftp.source.Source
FtpAgent.sources.ftp1.client.source = ftp
FtpAgent.sources.ftp1.name.server = 192.168.1.1
FtpAgent.sources.ftp1.user = xxxx
FtpAgent.sources.ftp1.password = xxxx
FtpAgent.sources.ftp1.port = 21
FtpAgent.sources.ftp1.flushlines = false
FtpAgent.sources.ftp1.chunk.size = 1024
FtpAgent.sources.fpt1.run.discover.delay=5000
FtpAgent.sources.fpt1.working.directory = /home/xxx/Desktop/ftp_sample

# Describing/Configuring the sink
FtpAgent.sinks.HDFS.type = hdfs
FtpAgent.sinks.HDFS.hdfs.path = hdfs://xxx/topics/Specified_flume_ftp_data
FtpAgent.sinks.HDFS.hdfs.fileType = DataStream
FtpAgent.sinks.HDFS.hdfs.writeFormat = Text
FtpAgent.sinks.HDFS.hdfs.batchSize = 1000
FtpAgent.sinks.HDFS.hdfs.rollSize = 100
FtpAgent.sinks.HDFS.hdfs.rollCount = 100000
FtpAgent.sinks.hdfs.serializer=Text

# Describing/Configuring the channel
FtpAgent.channels.MemChannel.type = memory
FtpAgent.channels.MemChannel.capacity = 100000
FtpAgent.channels.MemChannel.transactionCapacity = 1000
FtpAgent.channels.MemChannel.byteCapacity = 6912212

# Binding the source and sink to the channel
FtpAgent.sources.ftp1.channels = MemChannel
FtpAgent.sinks.HDFS.channel = MemChannel
```

I am running the agent with the command below:

```
bin/flume-ng agent --conf ./conf/ -f conf/flume_ftp_source.conf -Dflume.root.logger=DEBUG,console -n FtpAgent
```

lazaromedina commented 6 years ago

Hi sangeesivakumar, please try the property "filter.pattern" for matching files in the working directory (there is an example in the README). Best, Luis
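As a quick sanity check of how such a pattern behaves (filter.pattern is a Java regular expression; Python's `re` module behaves the same for a simple pattern like this, though whether the plugin anchors the match to the whole file name is an assumption here), a small sketch with made-up file names:

```python
import re

# filter.pattern is matched against file names; ".+\.csv" means
# "one or more characters, then a literal .csv".
pattern = re.compile(r".+\.csv")

names = ["data.csv", "report.csv", "notes.txt", ".csv"]
matches = [n for n in names if pattern.fullmatch(n)]
print(matches)  # ".csv" alone fails because ".+" requires at least
                # one character before the extension
```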

sangeesivakumar commented 6 years ago

@lazaromedina Even if I set the 'filter.pattern' property, it still fetches all records matching that pattern from the home location itself, not from the particular directory. For example:

```
FtpAgent.sources.fpt1.working.directory = /home/xxx/Desktop/ftp_sample
FtpAgent.sources.ftp1.filter.pattern = .+\.csv
```

With these properties I am pointing at the ftp_sample folder, which is inside Desktop, and need to fetch the CSV files in /home/xxx/Desktop/ftp_sample. But it fetches from the actual directory /home/xxx instead.

lazaromedina commented 6 years ago

Hi, can you please

sangeesivakumar commented 6 years ago

Yes, I have set it exactly as FtpAgent.sources.ftp1.filter.pattern = .+\.csv; the mistake was only in my earlier comment.

sangeesivakumar commented 6 years ago

Flume-FTP version: flume-ftp-source-2.2.0.jar. The config file is given above.

sangeesivakumar commented 6 years ago

```
18/05/23 17:55:05 INFO node.PollingPropertiesFileConfigurationProvider: Configuration provider starting
18/05/23 17:55:05 INFO node.PollingPropertiesFileConfigurationProvider: Reloading configuration file:conf/flume_ftp_source.conf
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Added sinks: HDFS Agent: FtpAgent
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Processing:hdfs
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/23 17:55:05 INFO conf.FlumeConfiguration: Post-validation flume configuration contains configuration for agents: [FtpAgent]
18/05/23 17:55:05 INFO node.AbstractConfigurationProvider: Creating channels
18/05/23 17:55:06 INFO channel.DefaultChannelFactory: Creating instance of channel MemChannel type memory
18/05/23 17:55:06 INFO node.AbstractConfigurationProvider: Created channel MemChannel
18/05/23 17:55:06 INFO source.DefaultSourceFactory: Creating instance of source ftp1, type org.keedio.flume.source.ftp.source.Source
18/05/23 17:55:06 INFO client.KeedioSource: Found previous map of files flumed: /tmp/default_file_track_status.ser
18/05/23 17:55:06 INFO sink.DefaultSinkFactory: Creating instance of sink: HDFS, type: hdfs
18/05/23 17:55:06 INFO hdfs.HDFSEventSink: Hadoop Security enabled: false
18/05/23 17:55:06 INFO node.AbstractConfigurationProvider: Channel MemChannel connected to [ftp1, HDFS]
18/05/23 17:55:06 INFO node.Application: Starting new configuration:{ sourceRunners:{ftp1=PollableSourceRunner: { source:org.keedio.flume.source.ftp.source.Source{name:ftp1,state:IDLE} counterGroup:{ name:null counters:{} } }} sinkRunners:{HDFS=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@575c0e6a counterGroup:{ name:null counters:{} } }} channels:{MemChannel=org.apache.flume.channel.MemoryChannel{name: MemChannel}} }
18/05/23 17:55:06 INFO node.Application: Starting Channel MemChannel
18/05/23 17:55:06 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: CHANNEL, name: MemChannel: Successfully registered new MBean.
18/05/23 17:55:06 INFO instrumentation.MonitoredCounterGroup: Component type: CHANNEL, name: MemChannel started
18/05/23 17:55:06 INFO node.Application: Starting Sink HDFS
18/05/23 17:55:06 INFO node.Application: Starting Source ftp1
18/05/23 17:55:06 INFO source.Source: Starting Keedio source ...
18/05/23 17:55:06 INFO source.Source: Source ftp1 starting. Metrics: SOURCE:SOURCE.ftp1{start_time=0, last_sent=0, MbProcessed=0, files_count=0, sendThroughput=0, filesProcCount=0, countModProc=0, KbProcessed=0, filesProcCountError=0, eventCount=0, bytesProcessed=0}
18/05/23 17:55:06 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: SOURCE, name: SOURCE.ftp1: Successfully registered new MBean.
18/05/23 17:55:06 INFO instrumentation.MonitoredCounterGroup: Component type: SOURCE, name: SOURCE.ftp1 started
18/05/23 17:55:06 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: SINK, name: HDFS: Successfully registered new MBean.
18/05/23 17:55:06 INFO instrumentation.MonitoredCounterGroup: Component type: SINK, name: HDFS started
18/05/23 17:55:06 INFO source.Source: property workdir is null, setting to default
18/05/23 17:55:06 INFO source.Source: Actual dir: /home/volumata files: 100
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [Desktop]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [ftp_sample]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [Documents]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [14-05-2018]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [Sangeetha]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [Algorithms_presentations]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [Notes_took_for_presentation]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [Analytics]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [Churn analysis]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [churn-analysis-functions]
18/05/23 17:55:06 INFO source.Source: Traversing element recursively: [churn-analysis-json-results]
18/05/23 17:55:06 INFO source.Source: Discovered: bank-customer-json-result.py ,size: 7493551
18/05/23 17:55:07 INFO hdfs.HDFSDataStream: Serializer = TEXT, UseRawLocalFileSystem = false
18/05/23 17:55:07 INFO hdfs.BucketWriter: Creating xxx/topics/Specified_flume_ftp_data/FlumeData.1527078307040.tmp
18/05/23 17:55:08 INFO hdfs.BucketWriter: Closing xxx/topics/Specified_flume_ftp_data/FlumeData.1527078307040.tmp
18/05/23 17:55:08 INFO hdfs.BucketWriter: Renaming xxx/topics/Specified_flume_ftp_data/FlumeData.1527078307040.tmp to xxx/topics/Specified_flume_ftp_data/FlumeData.1527078307040
ERROR source.Source: ChannelException
org.apache.flume.ChannelException: Unable to put event on required channel: org.apache.flume.channel.MemoryChannel{name: MemChannel}
    at org.apache.flume.channel.ChannelProcessor.processEvent(ChannelProcessor.java:275)
    at org.keedio.flume.source.ftp.source.Source.processMessage(Source.java:404)
    at org.keedio.flume.source.ftp.source.Source.readStream(Source.java:376)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:248)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.discoverElements(Source.java:196)
    at org.keedio.flume.source.ftp.source.Source.process(Source.java:102)
    at org.apache.flume.source.PollableSourceRunner$PollingRunner.run(PollableSourceRunner.java:137)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flume.ChannelException: Cannot commit transaction. Heap space limit of 55297 reached. Please increase heap space allocated to the channel as the sinks may not be keeping up with the sources
    at org.apache.flume.channel.MemoryChannel$MemoryTransaction.doCommit(MemoryChannel.java:126)
    at org.apache.flume.channel.BasicTransactionSemantics.commit(BasicTransactionSemantics.java:151)
    at org.apache.flume.channel.ChannelProcessor.processEvent(ChannelProcessor.java:267)
    ... 11 more
```

lazaromedina commented 6 years ago

Hi, try FtpAgent.sources.fpt1.working.directory = /ftp_sample. When the FTP client connects to the server, the server first returns the user's home, and only after that the configured working directory. When the server is asked for the parent of the working directory, it returns /.

sangeesivakumar commented 6 years ago

Only the ftp_sample folder is available inside Desktop. Should I specify it like this: FtpAgent.sources.fpt1.working.directory = /Desktop/ftp_sample ?

sangeesivakumar commented 6 years ago

It is still giving all the records in the home directory by default. Is there a separate property that needs to be set to point at only the particular directory?

sangeesivakumar commented 6 years ago

@lazaromedina do you have any idea how to solve this issue?

lazaromedina commented 6 years ago

Hi, can you try an FTP login with your user and password and check the line that looks like "Remote directory: " <---- which parent path is shown there?

Check the README section "Which files will be processed?". What the "Remote directory: " line shows is the parent root the server returns to flume-ftp (the FTP client). Put the folder whose files you want Flume to process under that path.

sangeesivakumar commented 6 years ago

I think your source only works with the user's home directory, not with the directory we specify in working.directory.

lazaromedina commented 6 years ago

Hi, your trace logs show this:

```
18/05/23 17:43:25 INFO source.Source: property workdir is null, setting to default
18/05/23 17:43:25 INFO source.Source: Actual dir: /home/volumata files: 31
```

and your config file says:

```
FtpAgent.sources.fpt1.working.directory = /home/xxx/Desktop/ftp_sample
```

Do you understand that the information you are providing is contradictory?

The working directory is the path the server returns to the client, and your server is returning the user's local home. To be able to help you, please follow the steps and provide the information I kindly asked for before.

If you can, try:

```
ftp <your server>
(log in)
pwd    <--- what does this show?
```

If you just want to publish your thoughts and assumptions about this tool, please use another channel, but do not open an issue for that.

best Luis.

sangeesivakumar commented 6 years ago

After logging in to the FTP server, checking the present working directory shows /home/volumata. What should I do to point at the specified directory?

sangeesivakumar commented 6 years ago

```
ftp> pwd
257 "/home/volumata" is the current directory
```

lazaromedina commented 6 years ago

Try setting the working directory in your config: /home/volumata/path_for_flume_files

Please add the log trace and config file for the actual test.

sangeesivakumar commented 6 years ago

That is exactly how I have set it in my working.directory property: /home/volumata/path_for_flume_files

lazaromedina commented 6 years ago

As shown in the log, that property is not taking effect for you:

```
18/05/23 17:55:06 INFO source.Source: property workdir is null, setting to default
18/05/23 17:55:06 INFO source.Source: Actual dir: /home/volumata files: 100
```

Check the configuration file, launch again, and attach the config file and the log.

sangeesivakumar commented 6 years ago

My configuration file

```
# Naming the components on the current agent
FtpAgent.sources = ftp1
FtpAgent.channels = MemoryChannel
FtpAgent.sinks = HDFS

# Describing/Configuring the source
FtpAgent.sources.ftp1.type = org.keedio.flume.source.ftp.source.Source
FtpAgent.sources.ftp1.client.source = ftp
FtpAgent.sources.ftp1.name.server = 192.168.1.1
FtpAgent.sources.ftp1.user = xxx
FtpAgent.sources.ftp1.password = xxx
FtpAgent.sources.ftp1.port = 21
FtpAgent.sources.ftp1.flushlines = false
FtpAgent.sources.ftp1.chunk.size = 1024
FtpAgent.sources.fpt1.run.discover.delay=5000
FtpAgent.sources.fpt1.working.directory = /home/volumata/Desktop

FtpAgent.sources.ftp1.filter.pattern = FlumeData.*
FtpAgent.sources.ftp1.filter.pattern = .+\.log
FtpAgent.sources.ftp1.search.recursive = false
FtpAgent.sources.ftp1.folder = /home/volumata/Desktop/ftp_sample

# Describing/Configuring the sink
FtpAgent.sinks.HDFS.type = hdfs
FtpAgent.sinks.HDFS.hdfs.path = hdfs://xxx/topics/flume_ftp_source
FtpAgent.sinks.HDFS.hdfs.fileType = DataStream
FtpAgent.sinks.HDFS.hdfs.writeFormat = Text
FtpAgent.sinks.HDFS.hdfs.batchSize = 1000
FtpAgent.sinks.HDFS.hdfs.rollSize = 100
FtpAgent.sinks.HDFS.hdfs.rollCount = 100000
FtpAgent.sinks.hdfs.serializer=Text

# Describing/Configuring the channel
FtpAgent.channels.MemoryChannel.type = memory
FtpAgent.channels.MemoryChannel.capacity = 100000
FtpAgent.channels.MemoryChannel.transactionCapacity = 1000
FtpAgent.channels.MemoryChannel.byteCapacity = 6912212

# Binding the source and sink to the channel
FtpAgent.sources.ftp1.channels = MemoryChannel
FtpAgent.sinks.HDFS.channel = MemoryChannel
```

sangeesivakumar commented 6 years ago

```
18/05/24 17:28:46 INFO node.PollingPropertiesFileConfigurationProvider: Configuration provider starting
18/05/24 17:28:46 INFO node.PollingPropertiesFileConfigurationProvider: Reloading configuration file:conf/flume_ftp_source.conf
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Added sinks: HDFS Agent: FtpAgent
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Processing:hdfs
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Processing:HDFS
18/05/24 17:28:46 INFO conf.FlumeConfiguration: Post-validation flume configuration contains configuration for agents: [FtpAgent]
18/05/24 17:28:46 INFO node.AbstractConfigurationProvider: Creating channels
18/05/24 17:28:46 INFO channel.DefaultChannelFactory: Creating instance of channel MemoryChannel type memory
18/05/24 17:28:46 INFO node.AbstractConfigurationProvider: Created channel MemoryChannel
18/05/24 17:28:46 INFO source.DefaultSourceFactory: Creating instance of source ftp1, type org.keedio.flume.source.ftp.source.Source
18/05/24 17:28:47 INFO client.KeedioSource: Found previous map of files flumed: /tmp/default_file_track_status.ser
18/05/24 17:28:47 INFO sink.DefaultSinkFactory: Creating instance of sink: HDFS, type: hdfs
18/05/24 17:28:47 INFO hdfs.HDFSEventSink: Hadoop Security enabled: false
18/05/24 17:28:47 INFO node.AbstractConfigurationProvider: Channel MemoryChannel connected to [ftp1, HDFS]
18/05/24 17:28:47 INFO node.Application: Starting new configuration:{ sourceRunners:{ftp1=PollableSourceRunner: { source:org.keedio.flume.source.ftp.source.Source{name:ftp1,state:IDLE} counterGroup:{ name:null counters:{} } }} sinkRunners:{HDFS=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@6b4b7fc2 counterGroup:{ name:null counters:{} } }} channels:{MemoryChannel=org.apache.flume.channel.MemoryChannel{name: MemoryChannel}} }
18/05/24 17:28:47 INFO node.Application: Starting Channel MemoryChannel
18/05/24 17:28:48 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: CHANNEL, name: MemoryChannel: Successfully registered new MBean.
18/05/24 17:28:48 INFO instrumentation.MonitoredCounterGroup: Component type: CHANNEL, name: MemoryChannel started
18/05/24 17:28:48 INFO node.Application: Starting Sink HDFS
18/05/24 17:28:48 INFO node.Application: Starting Source ftp1
18/05/24 17:28:48 INFO source.Source: Starting Keedio source ...
18/05/24 17:28:48 INFO source.Source: Source ftp1 starting. Metrics: SOURCE:SOURCE.ftp1{start_time=0, last_sent=0, MbProcessed=0, files_count=0, sendThroughput=0, filesProcCount=0, countModProc=0, KbProcessed=0, filesProcCountError=0, eventCount=0, bytesProcessed=0}
18/05/24 17:28:48 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: SINK, name: HDFS: Successfully registered new MBean.
18/05/24 17:28:48 INFO instrumentation.MonitoredCounterGroup: Component type: SINK, name: HDFS started
18/05/24 17:28:48 INFO instrumentation.MonitoredCounterGroup: Monitored counter group for type: SOURCE, name: SOURCE.ftp1: Successfully registered new MBean.
18/05/24 17:28:48 INFO instrumentation.MonitoredCounterGroup: Component type: SOURCE, name: SOURCE.ftp1 started
18/05/24 17:28:48 INFO source.Source: property workdir is null, setting to default
18/05/24 17:28:48 INFO source.Source: Actual dir: /home/volumata files: 30
18/05/24 17:28:48 INFO source.Source: Discovered: FlumeData.1526384902929 ,size: 681576
18/05/24 17:28:48 INFO hdfs.HDFSDataStream: Serializer = TEXT, UseRawLocalFileSystem = false
18/05/24 17:28:48 INFO hdfs.BucketWriter: Creating hdfs://xxx:8020/topics/flume_ftp_source/FlumeData.1527163128257.tmp
18/05/24 17:28:48 INFO source.Source: Processed: FlumeData.1526384902929, total files: 31
18/05/24 17:28:48 INFO source.Source: Discovered: avro-tools-1.8.2.jar ,size: 34795750
18/05/24 17:28:50 INFO hdfs.BucketWriter: Closing hdfs://xxx:8020/topics/flume_ftp_source/FlumeData.1527163128257.tmp
18/05/24 17:28:50 INFO hdfs.BucketWriter: Renaming hdfs://xxx:8020/topics/flume_ftp_source/FlumeData.1527163128257.tmp to hdfs://xxx:8020/topics/flume_ftp_source/FlumeData.1527163128257
```

sangeesivakumar commented 6 years ago

Why is the working.directory property not working for me?

sangeesivakumar commented 6 years ago

@lazaromedina can you identify where I went wrong?

lazaromedina commented 6 years ago

```
....
FtpAgent.sources.fpt1.working.directory = /home/volumata/Desktop
....
```

```
18/05/24 17:28:48 INFO source.Source: property workdir is null, setting to default
18/05/24 17:28:48 INFO source.Source: Actual dir: /home/volumata files: 30
```

If the config file corresponds to the trace log, you are setting a working directory, but the FTP client is getting null for it, so it falls back one level to the default root path the FTP server returns.

Can you access Desktop/ over a plain FTP connection? Try logging into the FTP server from the console or with a graphical client:

```
ftp <server>
...
cd Desktop
...
```

Is the FTP server granting access rights to "Desktop/"?

sangeesivakumar commented 6 years ago

Yes, I have already checked through the console whether I have permission for that:

```
ftp> cd Desktop/
250 Directory successfully changed.
ftp>
```

lazaromedina commented 6 years ago

I am trying to reproduce your problem but have not been successful. I think there is some problem with your config file, because although the working.directory property seems to be set, I see null in the log.

sangeesivakumar commented 6 years ago

What is the problem in my config file? I have posted it above. Where did it go wrong, and why am I getting null in the log?

sangeesivakumar commented 6 years ago

Where is the getWorkingDirectory() function in your source code?

lucarosellini commented 6 years ago

@sangeesivakumar it seems the names of two of your properties are wrong in a very subtle way.

You have:

```
FtpAgent.sources.fpt1.run.discover.delay=5000
FtpAgent.sources.fpt1.working.directory = /home/volumata/Desktop
```

The name of the source here is fpt1, which is incorrect; it should be ftp1. Do you see the difference?

All of the other properties you have seem to be correct.
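A one-letter slip like this is easy to catch mechanically. A hypothetical sketch of a checker (the helper name and the simplified property-file grammar it assumes are illustrative, not part of Flume) that flags source properties whose source name was never declared:

```python
import re

# Hypothetical helper: flag Flume properties whose <agent>.sources.<name>.*
# prefix uses a source name not declared in the "<agent>.sources" list.
def find_misnamed_sources(config_text):
    declared = set()
    for line in config_text.splitlines():
        # Collect declared source names, e.g. "FtpAgent.sources = ftp1"
        m = re.match(r"\s*(\w+)\.sources\s*=\s*(.+)", line)
        if m:
            declared.update(m.group(2).split())
    bad = []
    for line in config_text.splitlines():
        # Check every "<agent>.sources.<name>.<property>" line
        m = re.match(r"\s*\w+\.sources\.(\w+)\.", line)
        if m and m.group(1) not in declared:
            bad.append(line.strip())
    return bad

conf = """\
FtpAgent.sources = ftp1
FtpAgent.sources.ftp1.port = 21
FtpAgent.sources.fpt1.working.directory = /home/volumata/Desktop
"""
print(find_misnamed_sources(conf))
# only the "fpt1" line is flagged, since "fpt1" was never declared
```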

Could you please try fixing this and tell us if this works?

Luca

sangeesivakumar commented 6 years ago

@lucarosellini You're right. That was my mistake, I'm sorry. It is resolved now and points to the directory given in the working.directory property. Great!! Thanks for spotting my silly mistake.