Closed by GoogleCodeExporter 9 years ago
Add "limit:###" to your query. Example query: host:127.0.0.1 limit:1000
Original comment by sitko.ma...@gmail.com
on 24 Jul 2012 at 9:48
Yes, add limit:2000 (or whatever) to your query to get more results back. The
reason this is somewhat hidden is that the preferred method is to narrow your
search so it returns fewer results. Most searches can be narrowed by adding
term negations, such as +term1 -term2 -term3.
Original comment by mchol...@gmail.com
on 25 Jul 2012 at 1:23
Is it possible to change the default records from 1000 to something else?
Original comment by jacobrav...@gmail.com
on 4 Mar 2013 at 10:30
Do you mean change from the default of 100? Or do you mean retrieve more
than 1000? You can get up to 9999 by setting the limit manually, and you
can get more than that if you set the limit higher, but the search will then
execute as a batch query.
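For example (a sketch using the limit syntax described above; the exact threshold at which a search becomes a batch query may vary by ELSA version):

```
host:127.0.0.1 limit:9999    (runs as a normal interactive search)
host:127.0.0.1 limit:20000   (runs as a batch query)
```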
Original comment by mchol...@gmail.com
on 4 Mar 2013 at 2:13
Yes, I mean: how do I change the default of 100 to, for example, 1000 or 2000?
Thanks
Original comment by jacobrav...@gmail.com
on 5 Mar 2013 at 7:59
I've updated docs with an FAQ to address this question here:
https://code.google.com/p/enterprise-log-search-and-archive/wiki/FAQ
Original comment by mchol...@gmail.com
on 5 Mar 2013 at 8:19
Hello!
Isn't it possible to change the default of 100 to something else, so that users
won't need to use the limit keyword in searches?
Original comment by johan.br...@gmail.com
on 22 May 2013 at 7:23
No, it isn't. As per the entry I created on the FAQ, you should not normally
be asking for more than 100 results back in a search. If there's a specific
use case I haven't seen before that requires that all the time, please let me
know.
Original comment by mchol...@gmail.com
on 22 May 2013 at 2:02
How do you suggest getting an overview of logs from a large-scale network? I am
referring to logs from network devices such as routers and switches. When SNMP
support is insufficient, there is a need to be able to notice unknown types of
messages/logs before something actually breaks. I have often come across logs
indicating errors that no one was expecting and that no one was notified about.
To find these, I would like the ability to skim through logs (more than 100 at
a time) to see if I can detect something out of the ordinary.
Other ways of attacking the problem:
* Add next/previous buttons to page through the next or previous 100 logs in
chronological order.
* Search/display logs from the current time backwards (with a limit well over
100). I can't seem to find such an option; since I have to set the start time,
the results are otherwise limited to the first 100 logs that were received
since I installed ELSA.
Please correct me if I'm wrong; I have only been testing your tool for a couple
of days.
Original comment by johan.br...@gmail.com
on 22 May 2013 at 2:28
The best way to find unusual messages from a given host is to start adding term
negations. Take Cisco messages, for example. You start with a host (e.g.
192.168.1.1) and the unparsed logs, and start removing things:
host:192.168.1.1 class:none
host:192.168.1.1 class:none -loaded
host:192.168.1.1 class:none -loaded -configured
To get a good overview, the best way of finding hosts by class is to use the
(currently undocumented) _node_stats datasource like this:
datasource:_node_stats groupby:host class:none
Original comment by mchol...@gmail.com
on 22 May 2013 at 5:20
Well, what if there is no given host? Say I have 500 switches whose logs I need
to check. I don't want to browse through all the IPs one by one to see if there
is anything unusual going on. I would rather take 2000-5000 lines of log from
them at a time and just scroll through, preferably from the current time and
then stepping backwards.
Original comment by johan.br...@gmail.com
on 22 May 2013 at 5:34
You could do a bulk query using archive:1 limit:0 to get a raw dump of the logs
within the given timeframe as a bulk file, but I recommend trying a class:none
groupby:program query first to see the unique program names of logs that
haven't been classified. What you are essentially looking for is a log anomaly
detector, which isn't currently something ELSA does.
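Putting the suggestions from this thread together, a sketch of the workflow (the host IP and negated terms are placeholders from the earlier example):

```
class:none groupby:program            (overview: unique unclassified program names)
host:192.168.1.1 class:none -loaded   (drill into one device, negating known-good terms)
archive:1 limit:0                     (bulk raw dump of the timeframe as a file)
```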
Original comment by mchol...@gmail.com
on 22 May 2013 at 6:18
I must say, using ELSA for network syslog is a very fast way to search logs.
BUT I very much need a default limit higher than 100 and a default of more than
15 lines per page. I know you can set this in the query, but please listen to
your customers: we need more lines by default when troubleshooting network
syslogs.
Please !! :-)
Original comment by jacobrav...@gmail.com
on 21 Feb 2014 at 7:52
Added config settings and preferences for limit and rows_per_page in rev 1185.
Individual users can set these in preferences, default_settings. See docs for
details.
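A minimal sketch of what such a configuration fragment might look like in elsa_web.conf (the default_settings, limit, and rows_per_page names come from the comment above; the surrounding structure and the values are assumptions, so check the docs for the authoritative format):

```
"default_settings": {
    "limit": 1000,
    "rows_per_page": 50
}
```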
Original comment by mchol...@gmail.com
on 23 Mar 2014 at 10:47
Original issue reported on code.google.com by
vamsi.ma...@gmail.com
on 24 Jul 2012 at 7:49