Friday, March 28, 2008

Web 2.0 Performance Optimization - Images

The work that I've been doing around performance optimization for the WebLogic Portal Web 2.0 Playground site has made me realize just how truly bloated things have become. In the early 1990s I was building electronic brochures for PCs running DOS with 640k memory and 256 color VGA displays. This was before most people had ever heard of the web, and things like Flash, AJAX, and so on were unheard of. These apps certainly didn't do as much or look as nice as modern web-based apps, but one thing that they were was efficient! We spent a lot of our time optimizing the images and code to ensure that everything would fit onto a 1.4M floppy disk and run correctly within the tight confines of the PC.

So how does this ancient history relate to what I'm doing now? I've been doing some of the same type of optimization in order to make our site run better. While there are relatively few restrictions on modern computers compared to those ancient PCs, there are limits in the form of available bandwidth. If you've read my previous entries you've seen what a difference can be made by optimizing the resources for the application based on the restrictions imposed by the medium. Call me crazy, but I think that this sort of thing is fun, and I have to wonder how much better all software could be if it weren't so bloated.

PNG Optimization

I became a fan of the Portable Network Graphics (PNG) format when I realized it was lossless, unlike JPEG, and supported transparencies and other features better than GIF. Most modern browsers and tools support it well, and there are some nice utilities available. It's not suitable for everything, and in fact I've gone back to JPEGs for some large backgrounds, but it's arguably the best all-around format today.

A problem with PNG is that many image editing programs don't do a good job of optimizing the files. They include extraneous information when they save the files, and if I had to guess it is because they take a general purpose approach and save everything. This is fine for editing and sharing, but the extra bloat is not helpful when deploying these images in a web application.

The good news is that there are several utilities that can be used to optimize PNGs, and I would recommend that everyone doing web development have one in their arsenal. If you are interested in the technical details check out A guide to PNG optimization by Cosmin Truța. After trying a couple of the free and online tools, I found that I really needed good batch mode support and ended up purchasing PNGOUTWin from Ardfry Imaging, LLC. I am not affiliated with them in any way, but I do think that they have a great product that is well worth the price.

So how good is the compression? It really depends on the image, but 10-30% seems to be the average range. For our site the total savings for the PNGs used in various look and feels ranges from 8% to 25%, although the actual savings are higher as some of the larger PNGs are replaced with JPEGs. This might not sound like a lot, but when all of the images that are on a page are taken into consideration it adds up quickly.

Optimizing Other Formats

For the sample look and feels most of the images are PNGs, but there are a few GIFs in use for the titlebar images. These are typically fairly small and won't benefit much from optimization. If you do want to optimize the GIFs, most paint programs offer options for reducing the number of colors.

As stated earlier, you may wish to use JPEGs for things like photographs, large header graphics, backgrounds, and so on. JPEGs will typically be smaller, and offer lots of compression options if you're not picky about the quality. There are some cases where a PNG will be smaller, but those images are typically mostly empty. Be careful when working with JPEGs if you think you might want to edit the images later, as once they're saved as JPEGs they are often mangled by the compression. The various JPEG cleanup tools only go so far, and you may end up spending time with pixel-level editing tools trying to restore images, which is not fun!

I'm still going to go over CSS optimization and some other things in this area, but my next entry is probably going to dive into Disc and REST a bit more, and will include a new way for you to play with them live on wlp.bea.com. Stay tuned!

Thursday, March 20, 2008

And Now For Something Completely Different

It seems like a long, long time ago that I first wrote about the sample REST commands that we had made available on wlp.bea.com, but in fact it was less than a year. It's amazing how much has happened since then for me, WebLogic Portal, and of course BEA.

A Quick Refresher

My blog entry The REST of the Story includes an introduction to REST along with some links if you'd like to read a bit about it.

There are others, and there is naturally lots of discussion and debate over how RESTful something is. I have always been more interested in the practical application of ideas and technologies rather than the theory, so I try to avoid getting into these debates. What I do know is that REST-style commands work very well for Web 2.0 applications, and I like not having to use heavyweight server-side Java for everything.

A Web 2.0 Demonstration

You can see the REST commands in WLP 10.2 in action by visiting the Dynamic Visitor Tools Sample site. Follow the link on that page and you'll see this portal desktop:

dvt_desktop.png

You can create an account and login very easily, and when you do so it will enable the DVT functionality for things like drag-and-drop, adding portlets/pages/books, and changing the look and feel, layout, menus, and so on. What you might not be aware of is that nearly everything you see is powered by a combination of the WebLogic Portal REST API, the WLP Disc Framework, the Dojo Toolkit, and the Dynamic Visitor Tools Sample code.

Patterns

A visitor to this site will push buttons, make selections, drag and drop items, use inline editing, and so on as part of a Web 2.0 style interactive experience. A common pattern is used for nearly everything in the DVT, illustrated here:

rest_flow.png

We'll look into Disc more in an upcoming entry; for now, suffice it to say that it makes it easy to get to the client-side representations of portal objects. The DVT sample uses Disc to get information about the portal desktop, including the various labels, titles, DOM nodes, and so on that make them up. Without Disc a portal page is just a collection of DIVs and other HTML tags, and in the past developers often had to invent their own solutions for mapping these to the server-side definitions and instances. Not impossible, but it wasn't always easy and it meant custom solutions that might not interoperate or upgrade well.

Try It!

One thing that makes REST interesting is that it is very easy to use; in fact you can try it out without writing a single line of code. I will suggest that you use Mozilla Firefox with the Firebug add-on as you can do a lot with the console, but any browser will at least let you try the basics. Try the following URL:

http://wlp.bea.com/dvt/bea/wlp/api/portlet/list?webapp=dvt

This will return a list of available portlets in the dvt web app, which will look something like this:

rest_portlet_list.png

The XML is fairly straightforward:

  <rsp>
    <portlet_summaries> - Array of portlet summaries
      <portlet_summary> - Summary for a portlet
        <label>portlet_1</label> - The unique label for the portlet
        <title>My Portlet</title> - The title to display
        <icon>portlets/icons/myportlet.png</icon> - The optional icon
        <description>The portlet</description> - The optional description
      </portlet_summary>
      ...
    </portlet_summaries>
  </rsp>


This list can be used to display a list of portlets, as in this example from the DVT Gallery:


dvt_portlet_list.png

A Closer Look


While XML is great for many things, it's not so great on the client side, especially when working cross-browser. It's often easier to use JSON (JavaScript Object Notation), which is fully supported by the WLP REST API. Simply include the argument format=json and you'll get a response containing JSON, which is easily used on the client. Here is a screenshot from Firebug showing the arguments being used by the DVT to build the gallery listing above:


rest_portlet_list_params.png

These include:

  • desktop: dvt - The portal desktop, from Disc
  • format: json - Return the results as JSON
  • max: 200 - Return a maximum of 200 portlet summaries
  • portal: demo - The portal, from Disc
  • scope: visitor - Can be visitor, admin, or library
  • start: 0 - Start with the first portlet
  • webapp: dvt - The webapp to get the portlets for, from Disc
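These are ordinary query-string parameters, so client code can assemble the request URL with a small helper. The helper below is just an illustration (it is not part of the WLP or Dojo APIs); the values are the ones shown above:

```javascript
// Illustrative helper (not part of the WLP API): build a REST list URL
// from a base path and a map of query parameters.
function buildListUrl(base, params) {
    var pairs = [];
    for (var name in params) {
        if (params.hasOwnProperty(name)) {
            pairs.push(encodeURIComponent(name) + "=" + encodeURIComponent(params[name]));
        }
    }
    return base + "?" + pairs.join("&");
}

// The parameters used by the DVT gallery listing:
var url = buildListUrl("/dvt/bea/wlp/api/portlet/list", {
    desktop: "dvt",
    format: "json",
    max: 200,
    portal: "demo",
    scope: "visitor",
    start: 0,
    webapp: "dvt"
});
```

The resulting URL can then be passed to XMLHttpRequest just like the GET examples elsewhere in this entry.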



This will return the following:


rest_portlet_list_response.png

This is fairly dense, but if you format it you can see that it is similar to the XML:



  {
    "content": {
      "portlet_summaries": [
        { "title":"My Portlet Customers","label":"portlet_1","icon":"portlets/icons/myportlet.png" },
        ...
      ]
    }
  }


If you haven't already used JSON it might not be obvious why this is so much easier than XML. With JSON you can use the JavaScript eval function (or a nice wrapper for it as provided by Dojo and other toolkits) and get a JavaScript object back. This takes just a few lines of code:



  // xmlHttpReq is the completed XMLHttpRequest object
  var result = eval("(" + xmlHttpReq.responseText + ")");
  var content = result.content;
  var portletSummaries = content.portlet_summaries;
  for (var i = 0; i < portletSummaries.length; i++) {
      var title = portletSummaries[i].title;
      var label = portletSummaries[i].label;
      ...
  }


More Examples


You can use the same approach to get lists of most of the portal resources, using the following URL pattern:


<protocol>://<host>:<port>/<webapp>/bea/wlp/api/<type>/<action>/<label>?<params>

This is covered in more detail on our edocs site in The WebLogic Portal REST API, which is a good follow-up to this blog entry. If you'd like to explore some more, here are some of the types and actions you can try, all starting with http://wlp.bea.com/dvt/bea/wlp/api/:




  • portlet/details/<label>?webapp=dvt - Where label is from the portlet summary
  • lookandfeel/list?webapp=dvt
  • page/list?webapp=dvt
  • book/list?webapp=dvt
  • menu/list?webapp=dvt
  • theme/list?webapp=dvt

As you've probably noticed, so far the focus has been on reading from the server via HTTP GETs, but there is a lot more you can do using POSTs. The DVT sample uses this to change the look and feel, add portlets to a page, and so on, using the data from the GETs to create the various parameters. You can't really explore POSTs using URLs in the browser, but you can do so with a small amount of code. One thing to note is that you will need to be authenticated in order to do anything interesting, and you can only do the things that the server security allows.
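The mechanics are simple: a POST sends its parameters urlencoded in the request body rather than on the URL. Here is a minimal sketch; the update path and parameter names are hypothetical stand-ins (check The WebLogic Portal REST API documentation for the real ones), and the XMLHttpRequest portion only runs in a browser:

```javascript
// Encode a map of parameters as an application/x-www-form-urlencoded body.
function encodeParams(params) {
    var pairs = [];
    for (var name in params) {
        if (params.hasOwnProperty(name)) {
            pairs.push(encodeURIComponent(name) + "=" + encodeURIComponent(params[name]));
        }
    }
    return pairs.join("&");
}

// Hypothetical example: update a portlet title. The path and parameter
// names below are made up for illustration.
var body = encodeParams({ webapp: "dvt", portlet: "portlet_1", title: "My New Title" });

if (typeof XMLHttpRequest !== "undefined") {
    var req = new XMLHttpRequest();
    req.open("POST", "/dvt/bea/wlp/api/portlet/update", true);
    req.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    req.onreadystatechange = function () {
        if (req.readyState == 4 && req.status == 200) {
            // The server accepted the change (remember, you must be authenticated).
        }
    };
    req.send(body);
}
```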



I've got a set of sample portlets that I've been working on that duplicate much of the functionality in the DVT using "plain" JavaScript, with no reliance on Dojo, etc. All you need to do is have Disc enabled in a WebLogic Portal 10.2 desktop. They aren't meant to be examples of how to write portlets so much as they are demonstrations of using REST and Disc, and can be used to learn how to use these technologies with most any framework. I hope to make these available soon, but if you can't wait drop me a line and I'll send you the current set.



More to come on this, and I'll continue with the optimization series as well.

Wednesday, March 19, 2008

Web 2.0 Performance Optimization - JavaScript

In the two previous entries we have looked at some of the ways that we can optimize a site for Web 2.0 features by reducing the size and number of files. In our case we wanted to optimize wlp.bea.com in order to ensure that users had a pleasant experience when visiting the site. When it comes to JavaScript there are a few things that can be done, including these three:

  • Combine the code to reduce the number of files
  • Minify the code to remove unnecessary characters
  • gzip the code to further reduce the size

The last one is largely a function of the server, and you can download the GZip filter from the Dev2Dev > Utilities & Tools > Administration/Management page, use the mod_deflate module for Apache, use HTTP Compression for IIS, and so on. This will give you the largest bang for your buck in terms of the size of the download, but as mentioned earlier, watch out for firewalls that block or otherwise thwart this compression.

Combining the files sounds easy enough, and in many cases you can probably get by using a command-line tool such as the UNIX-style cat or other utilities. The problem is that some JavaScript libraries and applications rely upon the order of loading, and may not work well (if at all) if this isn't taken into consideration. We use the Dojo Toolkit for the Dynamic Visitor Tools, and it provides some nice build tools for this. I would expect that most toolkits provide something similar, and if not you could use one of the many standalone utilities available.
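To make the ordering problem concrete, here is a toy sketch of what a build tool does: concatenate module sources in dependency order so nothing executes before the code it relies on. The file names and contents are made up, and a real tool like the Dojo build system also resolves the dependency declarations for you:

```javascript
// Concatenate sources in the given order, with a banner comment per file.
function combineInOrder(orderedFiles, readFile) {
    var parts = [];
    for (var i = 0; i < orderedFiles.length; i++) {
        parts.push("/* ==== " + orderedFiles[i] + " ==== */");
        parts.push(readFile(orderedFiles[i]));
    }
    return parts.join("\n");
}

// Fake file contents keyed by name, standing in for reads from disk:
var sources = {
    "disc/core.js": "var disc = {};",
    "disc/io.js": "disc.io = {};" // depends on disc/core.js running first
};

var combined = combineInOrder(["disc/core.js", "disc/io.js"], function (name) {
    return sources[name];
});
```

If the order were reversed, the combined file would fail at load time when disc.io is assigned before disc exists.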

To see the effect of combining these files, take a look at this diagram:

js_optimization.png

The various bits include:

  • Disc (Dynamic Interface Scripting) - 23 files combined into 2 files
  • PM (Placeable Movement) - 40 files combined into 2 files
  • Bighorn (Look and Feel Skeleton) - 2 files combined into 1 file
  • DVT (Dynamic Visitor Tools) - 42 files combined into 2 files

Going from 107 files to 8 files is fairly dramatic, especially when you consider that many browsers only allow a couple of connections at a time. We haven't measured the impact on the server side, but it's reasonable to assume that it helps out there as well. In case you're wondering, the remaining 2 files are for localization, one from the Dojo toolkit and the other from the DVT. This brings the total to 109 before and 10 after, a better than 10X reduction.

When it comes to minification a good read is JSMin, The JavaScript Minifier by Douglas Crockford, and his site has other good tips and tools. Most minification tools do one or more of the following:

  • Remove comments
  • Remove whitespace
  • Remove blank lines
  • Remove new-line characters
  • Obfuscate variable and function names
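To see what those operations do, here is a made-up function minified by hand; the comments, whitespace, and newlines are stripped and the locals renamed, but the behavior is identical:

```javascript
// Before: readable source with comments and whitespace.
function addTax(amount, rate) {
    // rate is a percentage, e.g. 8.25
    var tax = amount * (rate / 100);
    return amount + tax;
}

// After: the same function, hand-minified.
function addTaxMin(a,r){return a+a*(r/100);}
```

A minifier applies these transformations mechanically across entire files.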

How much compression is possible via minification will vary wildly, but from our own testing it appears that anywhere from 10-30% is possible. There are examples of minified code on Crockford's site as well as others, and you can try it out on your own code using one of the available online tools. What I haven't measured is the difference between non-minified code vs. minified code when all of it is gzipped, but I do know that gzip doesn't always work and therefore minifying the code is worthwhile in any case.

We'll take a look at doing much the same thing for CSS files next, and we still want to look into further optimizations to reduce dynamic code loading, reduce image sizes, and so on. We're also interested in hearing about other tips, techniques, and tools, so please feel free to post a comment or email us.

Wednesday, March 12, 2008

Web 2.0 Performance Optimization - Testing, Measurement, and Scoring

In the first entry in this series Web 2.0 Performance Optimization - Get an A on YSlow! I included a few screenshots of before/after results from Firebug and YSlow. These aren't the only tools available, but they are among the best at the moment, and are invaluable for diagnosing problems in your web applications, even those you may not be aware of.

How do you use these tools? They're actually fairly easy to use, with straightforward user interfaces that integrate nicely into Mozilla Firefox. There is something called Firebug Lite that works in other browsers, including IE, but this is really just a console and not the same thing at all. There are some tools for IE that I'll discuss, and it looks like Microsoft has copied the Firebug features in the IE8 Developer Tools. These tools are only going to get better, and I believe we're just starting to scratch the surface in this area.

So how can you use them? Let's take a look at Firebug, selecting the Net tab and All subtab and viewing the results for a non-authenticated user on the Welcome page on the optimized wlp.bea.com site. Note that I've cleared all of the private data (cache, cookies, etc.) in order to see what the raw performance of the site would be for a first-time user:

firebug_optimized_welcome.png

As you might guess, clicking on the various tabs for JS, CSS, and so on will show you the results for only those object types. When there are only 20 or so files total as in the optimized site this isn't really a big deal, but when you look at the results for the non-optimized site the value becomes apparent:

firebug_original_welcome.png

If you are surprised that the non-optimized site took 25.24s to finish loading, I can promise you that we were even more surprised by it. The funny thing (and I don't mean that in the good sense) is that one of the biggest problems turned out to be the Symantec/Norton Client Firewall, which is installed and active on most BEA employee machines. This is not meant as a knock on Symantec or their products, but this firewall has a very annoying feature that causes issues with gzipped files, JavaScript files in particular.

This firewall is acting as a proxy for any internet connections and is decompressing the gzipped files before sending them to the browser. The good news is that the gzipped files are being sent from the server to the client, reducing the network bandwidth. The bad news is that it appears to be scanning the files after decompressing them, adding some overhead. This gets far worse when there are lots of JavaScript files as it appears that the firewall is processing them sequentially, limiting the connections to the server, or both. The even worse news is that there doesn't appear to be a way to disable this "feature" for known and trusted sites, although I remain hopeful that they'll enhance it at some point. The only workaround at the moment is to disable the firewall entirely, not something that everyone will be comfortable with.

Here are the results for the non-optimized site with the firewall disabled:

firebug_original_nofirewall_welcome.png

We still have the same large number of requests, but the size of the download has decreased by more than 3X and the page rendering time is 4X faster. That's not bad, but we're still far from what we'd like. Over the next few blog entries we'll see how we can optimize the site to have 4X fewer files, 1/2 the download size, and render the pages over 4X faster.

Now let's take a look at YSlow, which is thoughtfully integrated into Firebug. Here are the performance results for the original non-optimized site, with the firewall enabled:

yslow_original_welcome.png

Seeing the big fat F was depressing, although after looking around at various well-known Web 2.0 sites I knew that many of them weren't much better, and even the best of them still got a C. As a long time BEA employee and a true believer in performance and scalability, I was not going to rest until I got a C or better, and an A was always the target. Here are the results today for the optimized site:

yslow_optimized_welcome.png

As I stated in the first entry, using a CDN would easily get us that A. Disabling localization would also do it, but that's not very friendly, and I have some ideas on how to provide localization while reducing the file request count even further. The cool thing is that we're now looking into ways to provide some or all of these optimizations automagically in the server, using the PRODUCTION_MODE flag to toggle them on or off. I can't promise these for the WebLogic Portal Sunshine release currently scheduled for later this year, but we're going to try.

If you've read all of this and are wondering what the effects of disabling the firewall are on the original site, here they are:

yslow_original_nofirewall_welcome.png

In Tommy Boy the late, great Chris Farley shouted out "D+?... Oh, my God... I passed! I passed! Oh, man! " A 60 isn't a D+, and it's nothing really to shout about, but hey, I liked the movie and the quote, and it's somewhat fitting. I won't name or shame them, but there are plenty of interesting and well-known sites out there that are even worse off, and we haven't even started to optimize our site yet.

You may recall that I listed Fiddler in the earlier blog entry, an add-on for IE. It's not really the same thing as Firebug, but it can provide similar information about the files, download sizes, times, etc. In some ways it offers even more information, but quite frankly it doesn't feel as seamless as Firebug. I would suggest having it in your arsenal and using it, but personally I tend to develop on Firefox first using the various tools it offers and porting to other browsers.

In the next entry we'll start looking into how to reduce the number of files, as well as their size. The nice thing is that there is a huge improvement to be made by simply enabling production mode on the server, and as I said earlier in this entry we'll be looking at how to extend this even further in the future.

Monday, March 10, 2008

Web 2.0 Performance Optimization - Get an A on YSlow!

Okay, first off I have to be honest and admit that we only managed to get a high B on YSlow, with a score of 86. We weren't able to use a CDN, so that category dragged down the overall score. A CDN would easily get us to an A, and a relatively high one at that.

Some of you read that first paragraph and it made sense, others of you may be wondering at some of the jargon used. In this series of blogs I'll be covering a variety of technologies and techniques, and I'll try to clearly define them. If something isn't making sense, Googling a bit can help, but if you post a comment I'll post a reply trying to clear things up. Here are a few that might be helpful when reading these entries:

  • BEA WLP (WebLogic Portal) 10.2 - A J2EE (okay "Java EE", sheesh) -based enterprise portal
  • DVT (Dynamic Visitor Tools) Sample - A Web 2.0-based front-end for WLP apps
  • Dojo 1.0.x - An Ajax toolkit that is used in WLP 10.2 for the DVT sample
  • Apache HTTP Server - Popular web server
  • gzip (GNU Zip) - Compression utility
  • PNG (Portable Network Graphics) - Lossless image compression format
  • Firebug - Add-on for Firefox that provides diagnostic information
  • YSlow - Add-on for Firefox that works with Firebug to provide site optimization information
  • Fiddler - Add-on for Internet Explorer that provides diagnostic information
  • CDN (Content Delivery Network) - A system for distributing content across a network

There are a number of other tools and utilities that I'll discuss, and I'll try to provide links to them where possible.

When we first went live with the WebLogic Portal 10.2 Playground we asked a number of people to try it out and provide feedback on the performance. Some of the reports were glowing, with users saying that it was very fast, while others reported extremely slow page load times. We began by looking at the server-side configurations, tweaking and tuning the WebLogic and Apache servers, which helped a bit. After a while it became obvious that some machines were being overwhelmed by the number of requests, and the size of the responses wasn't helping either. No amount of caching or other server tuning was going to solve all of the issues that we were seeing, so we set off on our quest for speed.

We'll cover the details of this quest in the next few blog entries, but for now let's look at some before and after screenshots. These were captured using Firebug and YSlow for Firefox, with caching enabled but after a recent flush. This is after logging into the site to ensure that the maximum number of JavaScript files, images, etc. are shown. Note that the results have been trimmed a bit, with only the top and bottom shown.

Here are the Firebug results for the original site:

firebug_laptop_original_login.png

Compare this to the results for the optimized site, with far fewer requests, far less downloaded to the client, and a much faster response time:

firebug_laptop_optimized_login.png

YSlow, a great tool from the Yahoo! Developer Network, provides grades for website performance along with suggestions on how to speed things up. Here is the original site:

yslow_laptop_original_login.png

Here are the YSlow results for the optimized site:

yslow_laptop_optimized_login.png

As you can see we've managed to improve everything quite a bit, with only the use of a CDN keeping us from getting that A. We would also get a B for "Make fewer HTTP requests" if we turned off localization, or if we interned the localized strings, and we could probably get to an A on that if we combined a few of the remaining JavaScript files. At this point we're happy with the progress, especially when the optimized results are compared with the original. To summarize, here are the improvements:

  • Number of requests - 135 - 26 = 109 fewer requests
  • Size of responses - 1064k - 171k = 893k smaller
  • Total response time - 42.15s - 1.86s = 40.29s faster

Once the browser caches are primed the original site will get faster, but in the best cases it is still 3 or more times slower, and it can be much slower in certain circumstances. We can be thankful that the web, the browsers, the servers, and more are reasonably efficient, but there is no magical solution. Or as I like to say, even BEA hasn't figured out how to use quantum physics, wormholes, or magic to get around the laws of nature. What we can do is try to understand the problem and take the steps to create solutions, and thankfully none of it is truly rocket surgery.

Look for follow-ups on this over the next couple of weeks, with plenty of details. If you have any questions or suggestions, I'd love to hear them.

Sunday, March 2, 2008

Play and REST on wlp.bea.com

If you've read my previous entries or otherwise heard about BEA WebLogic Portal 10.2, you probably know that we have some interesting new features to support Rich Internet Applications and Web 2.0. If you haven't, or would like a refresher, my previous entries provide some background material you might want to take a look at.

Now it's time to have some fun and play with these technologies directly. I've added a few new portlets to the WebLogic Portal Playground that you can start using today. If you haven't already done so, go to the site and login and/or create a new user account, which is quick and easy. Once you're there, go to the Exploration page and you'll see these portlets:

  • Try It!
  • View It!

Using the Try It! Portlet

This portlet will let you try out the client-side Disc and REST features directly. The user interface should be fairly straightforward, but here is a key:

You can select a code template using the Template list:

tryit_templates_01.png

This will change the code displayed in the code text area:

tryit_code_01.png

The buttons on the toolbar allow you to Try It! the code, Copy to Clipboard (IE only), and Clear Output

tryit_buttons_01.png

Many of the code templates have related documentation that can be accessed via the Doc Links list:

tryit_doclinks_01.png

Try some of these out for yourself. For example, select the List Look and Feels template and press the Try It! button.

If there is any output from the code it will be displayed in the lower text area:

tryit_output_01.png

The code for this template looks like this:

    var appContext = bea.wlp.disc.context.Application.getInstance();
    var xmlHttpReq = new bea.wlp.disc.io.XMLHttpRequest();
    var url = "/" + appContext.getWebAppName() + "/bea/wlp/api/lookandfeel/list";
    var params = "";
    params += "?portal=" + appContext.getPortalPath();
    params += "&desktop=" + appContext.getDesktopPath();
    params += "&webapp=" + appContext.getWebAppName();
    params += "&scope=visitor";
    params += "&format=json";
    url += params;
    xmlHttpReq.open("GET", url, true);
    xmlHttpReq.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");

    function handler(xmlHttpReq) {
        if (xmlHttpReq.readyState == 4) {
            if (xmlHttpReq.status == 200) {
                var result = eval("(" + xmlHttpReq.responseText + ")");
                var content = result.content;
                var lookAndFeelDetails = content.lookandfeels;
                var lookAndFeelDetail = null;
                output("Displaying title: markup_name");
                for (var i = 0; i < lookAndFeelDetails.length; i++) {
                    lookAndFeelDetail = lookAndFeelDetails[i];
                    output(lookAndFeelDetail.title + ": " + lookAndFeelDetail.markup_name);
                }
            } else {
                output("Unable to retrieve look and feels.");
                output("Server response:\n\"" + xmlHttpReq.responseText + "\"");
            }
        }
    }

    xmlHttpReq.onreadystatechange = function () { handler(xmlHttpReq); };
    xmlHttpReq.send(null);


Note: The output function is a special-case provided for this sample and will direct output to the lower text area. If you are using Firebug or another console, you may wish to use that for the output. Similarly, the codearea variable used in some of the samples is a special-case used to represent the text area element for the code, which is useful for finding the portlet context, etc. using Disc.



You can modify the text in any of the templates and try it out yourself. The doc links for each of the templates are a great way to learn what is possible, and you'll find that you can do quite a lot. If you come up with some interesting templates that you think others might benefit from, send them my way and I'll add them along with a comment giving you credit.



Using the View It! Portlet


This portlet provides a tree-style view into the portal context objects provided by Disc. It builds the tree by iterating over the Disc objects, showing the type and either the title or the markup name. When an object is selected in the tree it will update the property sheet and attempt to highlight that object. The properties all come from Disc, and will give you an idea of what is available using that API. Simply add get to the attribute name and you will have the name of the function for that object. For example, if you click on a portlet you will see attributes such as:


viewit_props_01.png

You can use these in the Try It! portlet with code such as:



var portlet = bea.wlp.disc.context.Portlet.findByElement(codearea);
output("Label: " + portlet.getLabel());
output("Title: " + portlet.getTitle());
output("Page Title: " + portlet.getParentPage().getTitle());


Notice that there are functions for getMarkupElement and getContentMarkupElement, which are for the entire portlet and just the contents, respectively. Depending on the selected look and feel you can use these to change the portlet's style dynamically. For example, change the background color of the content area with the following:



var portlet = bea.wlp.disc.context.Portlet.findByElement(codearea);
portlet.getContentMarkupElement().firstChild.style.background = "green";


Note that you need to get the firstChild of the context object as the context object itself is a container.





You can dynamically change the portlet title with code such as:



var portlet = bea.wlp.disc.context.Portlet.findByElement(codearea);
var titlebar = portlet.getTitlebar().getMarkupElement();
var titleElement = titlebar.firstChild.firstChild;
output("Titlebar innerHTML: " + titleElement.innerHTML);
titleElement.innerHTML = "My Portlet";


Note that this won't change the portlet title permanently, but you can combine this with the REST command for updating the portlet if you'd like to do so. Check out the Update Portlet Title template to see the REST command in action, and where you could insert the code above to make this completely dynamic. This is how the DVT (Dynamic Visitor Tools) sample works, and it should demonstrate just how easy using Disc and REST can be.



I'm hoping to replace the simple text area-based editor in the sample with something better soon, and may provide an Upload Your Code feature as well. If you have other ideas for features, templates, etc., I'd love to hear from you.

 