jQuery Lifestream Plug-in
Problem
This is a plug-in which tracks your online activity, a.k.a. a lifestream. You can currently use feeds like:
- Delicious
- Flickr
- Github
- Google Reader
- Last.fm
- Stackoverflow
- Youtube
At first I decided to load all the feeds at the end of all the asynchronous requests. This only required one DOM update at the end.
The problem with that approach is that it just takes too long to load. Some requests timed out and others just took a while.
What I'm doing right now is changing the DOM after each feed request (e.g. Twitter). At the moment this means 8 DOM changes.
What would be the best practice in this situation? Using setTimeout and accessing the DOM after e.g. 400 ms? Or would you leave it as it is now?
If you have any other remarks or suggestions, they would be appreciated as well.
```javascript
/**
 * Initializes the lifestream plug-in
 * @param {Object} config Configuration object
 */
$.fn.lifestream = function(config){
  var outputElement = this;

  // Extend the default settings with the values passed
  var settings = jQuery.extend({
      "classname": "lifestream",
      "limit": 10
    }, config),
    data = {
      "count": settings.list.length,
      "items": []
    };

  // Merge a feed's response into the collected items and redraw
  var finished = function(inputdata){
    $.merge(data.items, inputdata);

    // Sort the items by date, newest first
    data.items.sort(function(a, b){
      if(a.date > b.date){
        return -1;
      } else if(a.date === b.date){
        return 0;
      } else {
        return 1;
      }
    });

    var div = $('<div class="' + settings.classname + '"/>');
    // Show at most "limit" items
    var length = (data.items.length < settings.limit)
      ? data.items.length
      : settings.limit;
    for(var i = 0; i < length; i++){
      div.append('<div class="' + settings.classname + '-item">'
        + data.items[i].html + '</div>');
    }
    outputElement.html(div);
  };

  var load = function(){
    // Run over all the items in the list; each feed module fetches
    // its data and reports back through the "finished" callback
    // (loop body reconstructed; see the "Complete code" link below)
    for(var i = 0, j = settings.list.length; i < j; i++){
      var item = settings.list[i];
      if($.fn.lifestream.feeds[item.service]){
        $.fn.lifestream.feeds[item.service](item, finished);
      }
    }
  };

  load();
  return this;
};
```
Complete code:
- https://github.com/christianv/jquery-lifestream/blob/master/jquery.lifestream.js
Documentation:
- https://github.com/christianv/jquery-lifestream#readme
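For reference, the plug-in is driven by a settings object whose `list` entries name a service (the `classname` and `limit` defaults come from the code above; the service names and per-service options below are illustrative, so check the linked documentation for the real ones). A call might look like:

```javascript
$("#lifestream").lifestream({
  classname: "lifestream",
  limit: 10,
  list: [
    { service: "twitter", user: "someuser" },  // hypothetical options
    { service: "flickr",  user: "someuser" }
  ]
});
```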
Solution
> At first I decided to load all the feeds at the end of all the asynchronous requests. This only required one DOM request at the end. The problem with that approach is that it just takes too long to load. Some requests timed out and others just took a while.
It's unclear to me why the requests are timing out, but you could simply store the results of the responses in memory as they arrive; no synchronization logic is necessary, you only need to know when all the requests have finished. The downside of this approach, however, is that timeouts can make your application show nothing to your visitors, which is something you really don't want.
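The "know when all requests ended" part can be done with a simple counter, no locking needed since JavaScript callbacks never run concurrently. A minimal sketch in plain JavaScript (the feed names and `fetchFeed` helper are stand-ins, not the plug-in's actual code):

```javascript
// Collect every feed's items in memory and render only once,
// after the last request has finished.
var feeds = ["twitter", "flickr", "github"];  // hypothetical services
var results = [];
var pending = feeds.length;

// Stand-in for a per-service AJAX call: answers asynchronously.
function fetchFeed(name, callback) {
  setTimeout(function () {
    callback([{ service: name }]);
  }, Math.random() * 50);
}

feeds.forEach(function (name) {
  fetchFeed(name, function (items) {
    results = results.concat(items);
    pending -= 1;
    if (pending === 0) {
      // All requests are done: the single DOM write would happen here.
      console.log("rendering " + results.length + " items");
    }
  });
});
```

The weakness the answer points out is visible here: if one `callback` never fires, `pending` never reaches zero and nothing is ever rendered.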
> What I'm doing right now is changing the DOM after each feed request (e.g. Twitter). At the moment this means 8 DOM changes.
It feels to me that 8 DOM changes are trivial for your browser to handle; the thing to do here is to figure out how many requests your code actually needs to be capable of handling.
A bad option would be to update the DOM on every X-th response, because that reintroduces the timeout problem. A better option is to add your responses to a queue and update the DOM every X ms. That way you have the best of both worlds, and it doesn't matter how many feeds you add, because it scales well. There may be other problems at the high end, such as updating too much of the DOM at once, but that's beyond the scope of this question.
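The queue idea can be sketched in plain JavaScript (names, timings, and the array standing in for the DOM are illustrative, not from the plug-in): responses are buffered as they arrive, and a timer flushes the buffer in one batch every 100 ms, so a slow or timed-out feed never blocks rendering of the others.

```javascript
var queue = [];
var rendered = [];  // stands in for the DOM in this sketch

// Each feed calls this as soon as its request completes.
function onFeedResponse(items) {
  queue = queue.concat(items);
}

// Flush the queue periodically: one batched update per tick,
// no matter how many responses arrived in the meantime.
var flusher = setInterval(function () {
  if (queue.length > 0) {
    rendered = rendered.concat(queue.splice(0, queue.length));
  }
}, 100);

// Simulate three feeds answering at different speeds; the slow one
// does not delay rendering of the first two.
[10, 40, 250].forEach(function (delay, i) {
  setTimeout(function () {
    onFeedResponse([{ service: "feed" + i }]);
  }, delay);
});

// Stop the timer once the demo is over.
setTimeout(function () { clearInterval(flusher); }, 500);
```

In the real plug-in, the flush step would be where `finished` sorts the queued items and writes them into `outputElement` in a single `.html()` call.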
Context
StackExchange Code Review Q#2535, answer score: 3