Closed dudipsh closed 4 years ago
@dudipsh A backward-growing chat with messages of uncertain size is an extremely important use case. I have been dreaming for months about a demo implementing it. I believe the requirements you put in the issue description can be fulfilled with the existing API. Let me think about how we can do it. What if I develop a demo application on StackBlitz and ask you to participate in testing and improving it?
@dhilt Thanks for the quick answer. I think it would be amazing to open the StackBlitz for this use case, and I will help with improving and testing.
@dudipsh Hello! I started the demo: https://stackblitz.com/edit/ngx-ui-scroll-chat-app Please take a look; it would be great if you could confirm we are on the right track and could help to take it further.
Main ideas of the demo:

- the `Adapter.append` method; in production, `newMessages$` should be triggered automatically when new messages arrive via sockets or whatever; in dev this is emulated by the `doAppend` method
- the `isLoading$` observable, whose value should be switched to `false` and taken once

The implementation is really raw, especially the caching. But as a starting point it seems to work. My plan is to discuss the current implementation, improve it, cover more cases, and re-make the demo app to be more like a real Chat App. And it would be great if you forked it at some point so we can do this together.
Hey @dhilt First of all, I wanted to thank you for taking the time to build this demo.
From the looks of it, I think we are definitely headed in the right direction. I liked your code design and I'm planning on using it. Also, this demo will answer a lot of people's queries.
I am going to start working on this immediately and will update as I go along.
@dudipsh Great! The most important thing I would love to have in the end is a separate data-connection layer with a fixed API that could be re-used across different apps with a random database behind it. I'm talking about MessageService. Its implementation is app specific, but its API should be common and fixed. Right now the MessageService API implies two agreements: `index` and `count`. I'm open to discussing this API, changing it, and making it more generic and convenient.
Also, I would not recommend calling `requestData` or its analogue on the app component start (ngOnInit). Instead, the required logic should be encapsulated inside the `MessageService.requestData` method. The point is that the scroller will trigger the `Datasource.get` method on the app component start with the appropriate `index` and `count` params (generally, `get` will be triggered more than once to fill out the viewport). And if you need to run some specific data/channel initialization procedure (for example, once per app component initial render), just do it inside the `MessageService.requestData` method and merge the result with the real index-count request using an appropriate RxJS technique.
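The merge described above can be sketched roughly as follows. This is a minimal promise-based sketch rather than the demo's actual RxJS code; `fetchMessages` and the stub bodies are hypothetical stand-ins for the real data calls (HTTP/GraphQL/etc):

```typescript
// Hypothetical sketch (not the demo's code): one-time initialization is
// memoized inside MessageService and merged with every index-count request.
class MessageService {
  private initialized: Promise<void> | null = null;

  private initialize(): Promise<void> {
    // runs the channel/connection setup exactly once per service instance
    if (!this.initialized) {
      this.initialized = Promise.resolve(); // real setup would go here
    }
    return this.initialized;
  }

  async requestData(index: number, count: number): Promise<string[]> {
    await this.initialize(); // merged with the real index-count request
    return this.fetchMessages(index, count);
  }

  private async fetchMessages(index: number, count: number): Promise<string[]> {
    // stub: produce `count` items starting at `index`
    return Array.from({ length: count }, (_, i) => `msg #${index + i}`);
  }
}
```

In the demo's RxJS terms, the same effect can be achieved by concatenating the initialization observable with the data-request observable, so the Datasource consumer never has to know about the setup step.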
Last, I'd recommend removing the cache logic until the initial implementation is done. This is a sort of improvement we can live without. And I'd like to have caching as a separate switchable layer, because there may be solutions that already have caching (for example, HTTP Interceptor caching in case the data is accessible via CRUD).
Hey @dhilt
In order to simplify things, I decided to change my backend from GraphQL to a REST API.
```typescript
@Get()
async findAll(@Query() paginationDto: PaginationDto): Promise<PaginationMessageResultDto> {
  paginationDto.page = Number(paginationDto.page);
  paginationDto.limit = Number(paginationDto.limit);
  return await this.chatService.readMessageByChannelByDate(paginationDto);
}
```
```typescript
async readMessageByChannelByDate(data: PaginationDto) {
  const { page, limit, channelId } = data; // destructure the incoming DTO
  const skippedItems = (page - 1) * limit;
  const totalCount = await this.messageModel.find({ channelId }).countDocuments();
  const messages = await this.messageModel
    .find({ channelId })
    .skip(skippedItems)
    .limit(limit)
    .sort('-ts');
  return {
    totalCount,
    page,
    channelId,
    limit,
    data: messages,
  };
}
```
You can see here that I return all the messages with pagination, and I sort the data starting from the last incoming item. Now when the component is loaded, the first item appears on top instead of at the desired bottom. Not being able to see an example with a live API data server makes it harder for me to put things together correctly.
I created a repo; I would appreciate it if you could take a look at it.
I can look, but I can't run it because no mock data is provided. It would be great if this worked just by opening: https://stackblitz.com/github/dudipsh/ngx-ui-scroll-chat.
Also, I'm not sure switching to a page-limit data flow makes life simpler, because you need to take care of converting the page-limit paradigm to index-count. An example of such a conversion can be taken from https://dhilt.github.io/ngx-ui-scroll/#/datasource#pages. Anyway, the code base grows, and without mocking the data requests it will consume too much time to achieve practical results. But in theory you just need to pass index-count data arrays to the UiScroll Datasource initialized with the "reverse" setting, and the start index must be the same as the minimal index (I believe -10 does more harm than good).
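The page-limit to index-count conversion mentioned above can be sketched as a small pure helper. The function name and the returned shape are illustrative assumptions, not the linked demo's code; `limit` is the backend's page size:

```typescript
// Given an index-count request, compute which backend pages must be fetched
// and where to slice the concatenated pages result.
function pagesForRange(startIndex: number, count: number, limit: number) {
  const endIndex = startIndex + count - 1;
  const startPage = Math.floor(startIndex / limit);
  const endPage = Math.floor(endIndex / limit);
  const pages: number[] = [];
  for (let p = startPage; p <= endPage; p++) {
    pages.push(p); // every page overlapping the [startIndex, endIndex] range
  }
  // offset of the requested range within the concatenated pages result
  const sliceStart = startIndex - startPage * limit;
  return { pages, sliceStart, sliceEnd: sliceStart + count };
}
```

For example, a request for 10 items starting at index 25 with limit = 30 spans pages 0 and 1, and the final result is sliced out of the 60 concatenated items.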
Thanks for the link, I will look into it.
As for the StackBlitz: unfortunately the API doesn't work with it, so I updated the URL in the chat API service to point to my dev server. To make it work you need to run it locally. In addition, I added a mock in /assets/mock.json.
@dhilt I added an orderBy param to make it easy to debug (ts or -ts). You can try it via Postman: http://graphql.vdi.co.il/message?page=1&limit=5&orderBy=ts&channelId=5edaa34844a52a4e8a2159d9
There should be an https-based API available, because StackBlitz works through an https connection. The problem is described here. Anyway, if you are going to go to prod at some point, you'll need https for the API server, and it would be great if you could also provide https for the dev API server.
Another option I see is to just mock all the requests in the dev environment via an http-interceptor, or via a mock service intercepting data requests before making the http call (like RemoteService in my demo).
Obviously, this was my testing environment. I changed it to https and now the StackBlitz works 👍
I started developing a specification for the Chat App and the first part is ready for use: https://github.com/dhilt/ngx-ui-scroll/wiki/Chat-App
It helped me to look at the concept and the requirements more precisely, so I'm ready to propose the second version of the demo App, which seems more powerful and clean: https://stackblitz.com/edit/ngx-ui-scroll-chat-app
The wiki has almost no direct connection with the demo, but the 3 scenarios described in the wiki correspond to 3 requirements of MessageService:

- `MessageService.initialize()` method
- `MessageService.requestData(index, count)` method
- `MessageService.newMessages$` observable

With the new initialization approach (the connect scenario), we may have a random number of items on start. This is managed via the `AMOUNT` and `AMOUNT_INIT` properties of the Remote service. This also means that the App would want a specific data object on connect: { items, total }. See the wiki and demo for details.
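The three requirements above can be summarized as a fixed API surface. This is a hedged sketch only: the `Message` shape is an assumption, and `Emitter` is a dependency-free stand-in for the RxJS `Subject` the demo actually uses.

```typescript
// Sketch of the fixed MessageService API implied by the three wiki scenarios.
interface Message {
  id: string;
  text: string;
  ts: number;
}

type Listener = (messages: Message[]) => void;

// dependency-free stand-in for an RxJS Subject
class Emitter {
  private listeners: Listener[] = [];
  subscribe(fn: Listener): void {
    this.listeners.push(fn);
  }
  next(messages: Message[]): void {
    this.listeners.forEach(fn => fn(messages));
  }
}

interface MessageServiceApi {
  // "connect" scenario: initial items plus the total count
  initialize(channelId: string): Promise<{ items: Message[]; total: number }>;
  // "scroll" scenario: index-count data request
  requestData(index: number, count: number): Promise<Message[]>;
  // "push" scenario: new messages arriving from the remote channel
  newMessages$: Emitter;
}
```

Keeping the implementation app specific behind this fixed surface is what lets the data-connection layer be re-used across apps with different databases behind them.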
The most complex part of the demo is the implementation of the "push" scenario. As you may see, I extracted the +1/5/100 handler outside the Chat component, and it calls the Remote service. This way I've broken the explicit two-way chain and made it uni-directional: the Remote service produces new item(s) and fires the appropriate event, the Message service handles this event and triggers the `newMessages$` observable, and the Chat component gets the new item(s) and processes them. This is basically the backward-push scenario, where the remote channel storage receives new message(s) from some random user.

Also, I simplified caching. It will be used only if the entire request can be fulfilled with the Cache. No merges. And ngx-ui-scroll v1.7.0 is used; it has the `Adapter.relax` method, which allows simplifying the `isLoading` approach.
It would be great if you could re-use the demo, replace the remote infrastructure with GraphQL, and re-implement the basic Message service entities in accordance with the requirements described here. If you follow this approach, I will be able to help more efficiently. Also, note that https://stackblitz.com/github/dudipsh/ngx-ui-scroll-chat must work; currently it doesn't (401, despite https).
@dhilt Thank you, it looks like you've invested a lot. I'll take a look at it once my laptop comes back from the lab.
I fixed the server (401); now messages arrive: https://stackblitz.com/github/dudipsh/ngx-ui-scroll-chat
In the meantime, I'm going over your new documentation https://github.com/dhilt/ngx-ui-scroll/wiki/Chat-App
Hey @dhilt
I read the documentation, and maybe I'm misunderstanding something, but it seems to me that there is no way to fetch data in groups from the BE.
Currently, if I bring all the messages at once it works well, but the desired behavior I'm looking for is to bring the messages in groups at initialization, not all at once. For example, if the channel has 100 messages, I want to be able to fetch 30 messages at a time; if the user scrolls to item 30, fetch 30 more...
I cleaned up the code here: https://stackblitz.com/github/dudipsh/ngx-ui-scroll-chat
@dudipsh I believe we are close to the end result in your demo.
First, the initialization of the channel seems excessive and could be simplified as follows:
```typescript
// chat.component.ts
async ngOnInit() {
  this.activeChannel = '5edaa34844a52a4e8a2159d9';
  await this.messageService.initialize(this.activeChannel);
  if (this.messageService.lastIndex >= 0) {
    this.isInitialized = true;
  }
}
```
This way you request the first data page only once; the result is cached at the MessageService layer and will be available for the `ChatListItemsComponent` with no latency.
Second, the page-limit API requires an additional transformation layer between the `MessageService.request` and `ChatService.getMessages` methods. My guess is that the best place for this layer is the body of the `MessageService.request` method, after the cache logic. This is the most challenging update, so let me dive into it:
```typescript
// assume start/end indexes are based on index/count with boundaries
const startPage = Math.floor(startIndex / limit); // limit = 30 in your case
const endPage = Math.floor(endIndex / limit);
const pageRequests = []; // store the requests
for (let i = startPage; i <= endPage; i++) {
  pageRequests.push(
    this.remoteService.getMessages(this.channelId, i)
      .pipe(map(res => res.data)) // only the items arrays are needed
  );
}
return forkJoin(pageRequests).pipe(map(pagesResult => {
  pagesResult = pagesResult.reduce((acc, pageResult) =>
    [...acc, ...pageResult] // flattening the response
  , []);
  // very important: cache indexes are transformed back from pages to index-count
  pagesResult.forEach((item, i) =>
    this.persist(startPage * limit + i, item)
  );
  // slicing the pages result to provide only what the Datasource wants
  const start = startIndex - startPage * limit;
  const end = start + endIndex - startIndex + 1;
  return pagesResult.slice(start, end);
}));
```
The approach is taken from this demo (the async tab). With your settings you will not run into a situation where such merging happens. But if you set the `limit` value close to the `bufferSize` value (for example, both are 10, or just reduce `limit` to 5), you will see how it works.
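For illustration, here is a hypothetical sketch of what a `persist`-style cache could look like. It serves a request only when every index in the range is present (no merges), matching the simplified caching described earlier; the class and method names are assumptions, not the demo's code:

```typescript
// Items are keyed by their absolute index; a range is served from cache
// only when every index in it has been persisted.
class MessageCache<T> {
  private items = new Map<number, T>();

  persist(index: number, item: T): void {
    this.items.set(index, item);
  }

  // returns the cached range, or null if any item in it is missing
  get(startIndex: number, count: number): T[] | null {
    const result: T[] = [];
    for (let i = startIndex; i < startIndex + count; i++) {
      const item = this.items.get(i);
      if (item === undefined) {
        return null; // cache miss: fall through to the remote request
      }
      result.push(item);
    }
    return result;
  }
}
```

Keying by absolute index (rather than by page) is what lets the cache stay independent of the page-limit transformation layer.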
Last, I forked your demo, applied the above suggestions, made additional minor API changes, and cleaned it up: https://stackblitz.com/edit/ngx-ui-scroll-chat-dudipshn. I like the result! But there is no push scenario yet; it may take significant effort to implement it.
@dhilt, I know that the push scenario is complex and I appreciate the effort. While we are at it, I would also like to propose that we take other CRUD behaviors into consideration, such as EDIT and DELETE messages.
I took a look at the StackBlitz link, and for some unforeseeable reason the first and last messages are in place, but the middle messages arrive in a random order.
@dudipsh Well, let's take 2 requests:
https://graphql.vdi.co.il/message?channelId=5edaa34844a52a4e8a2159d9&page=0&limit=5&orderBy=ts
https://graphql.vdi.co.il/message?channelId=5edaa34844a52a4e8a2159d9&page=1&limit=5&orderBy=ts
The first one gives `[divider, First, a1, a2, a3]`. I'm expecting `[a4, a5, a6, a7, a8]` as the result of the second one, because `page` is incremented. But the second request gives `[First, a1, a2, a3, a4, a5]`.

How is that possible? Does the `page` parameter work as an `index`? Could it be switched to "page"? Currently this demo App expects the page-limit approach as the remote API.
I changed it to a page-limit approach; now it should work as you expected.
BE:

```typescript
const skippedItems = page * limit;
return await this.messageModel
  .find({ channelId })
  .skip(skippedItems)
  .limit(limit)
  .sort(orderBy);
```
@dhilt Hey, the channelId has changed to "5eec5bf61cfa5bd2b297a496" https://stackblitz.com/edit/ngx-ui-scroll-chat-dudipshn.
@dhilt Any update regarding pushing messages? I was able to partially succeed, but it's very buggy: the messages seem to disappear after scrolling them out of view, meaning when I scroll back they no longer exist. I tried adding the possibility of pushing messages in StackBlitz, but unfortunately it's not playing nice with the GraphQL; still, you can see what I mean in the link below.
up
@dudipsh I looked into your code; it is almost fine. When you push a new message to a channel, the Remote Service should notify the App that this message was successfully sent and can be added to the list (UI). So at the Component level it should look like:
```typescript
sendMessage() {
  this.remoteService // this.chatService
    .sendMessage(this.activeChannel, new Message(this.text))
    .subscribe(() => this.text = ''); // clean up the text in response to success
  // also, need to handle the error case
}
```
And at the Remote Service level we should have something like:
```typescript
sendMessage(channelId, message) {
  this.sendMessageToChannel(channelId, message) // this.apollo.mutate().pipe
    .subscribe(() => {
      this.newData$.next([message]);
    });
}
```
The main difference is that `RemoteService.newData$` must be the entry point of the process of emitting the result message, not `MessageService.onPush$`. The code of `MessageService.initialize` provides the re-emitting along with the necessary caching of the result message(s). If you need to emulate pushing, just call `this.newData$.next([message])` and return `of(true)` without the DB/apollo stuff.
@dhilt I'm pleased to inform you that we finally have lift-off. Everything seems to be working smoothly and I don't see any bugs related to this. Thank you so much for your support through this; you are great.
@dudipsh Great! I will continue my wiki document for the Chat App that was started thanks to this thread.
Hello
@dhilt I would like to use your library to build a chat app. The requirements are:
1. auto row height
2. start from bottom
3. virtual + infinite scroll implemented with GraphQL (fetch more)

From your experience, do you know if this library can tackle these requirements? I went through the documentation and the available demos and couldn't put together a solution. Can you please point me in the right direction?
My old code looked like this:
Thanks in advance, best regards