
Jan. 12, 2009
-------------

Wrote up the basic Sakai CourseFeed documentation. It's pretty rough in places and needs more detail. The whole account activation process is a kludge, IMO. It would be far better to do this in the CourseFeed gateway servlet as a REST request. One problem (besides some remaining technical issues) is that virtually anyone will be able to do the Sakai/Facebook link. There is no control by student or group of students. However, if I can get the gateway approach to work, we can configure this to work both ways. See notes on Nov. 24, 2008. Also note:

	The call to CourseFeedService.registerUser() uses the real userId, rather than the eid.  
	I'd need to extract the EID from the session to do it this way.  Not worth it, since 
	I've already told ClassTop to access the CourseFeed tool using the GetToolId REST request.

The activation request is of the form:

 
	http://localhost:8080/cfgateway?action=IntegrateFacebook&fbId=636389812
	&hs=SQX%5b%00%0b%17RY
	&success=http%3a%2f%2fapps.facebook.com%2fcoursefeed%2f%3fdisplay%3dbbconfirmed%26lmsServerId%3d123
	&failed=http%3a%2f%2fapps.facebook.com%2fcoursefeed%2f%3fdisplay%3dbbconfirmed%26lmsServerId%3d123
 

This command needs to be issued after the user has logged into Sakai (and has an active Sakai session). There may be some collision between the coursefeedgateway session and a student's session.

The return should be an HTTP re-direct command based on the outcome of registration.

Created student02. Logged in and ran the above URL. Table results are:


 +--------------------------------------+---------------+
 | username                             | facebook      |
 +--------------------------------------+---------------+
 | ea1fe76e-84ff-4aef-b2c7-aeeb2e3a50d9 | 636389812     |
 | betty                                | 745478915     |
 | mark                                 | 99999999      |
 | test                                 | [facebook-id] |
 +--------------------------------------+---------------+

The first entry is student02 using the real ID instead of the EID. Need to fix to use EID.
Added code to get EID from User object. Tested against student02 and now works:

mysql> select * from classtop_access;
+-----------+---------------+
| username  | facebook      |
+-----------+---------------+
| student02 | 636389812     |
| betty     | 745478915     |
| mark      | 99999999      |
| test      | [facebook-id] |
+-----------+---------------+
4 rows in set (0.00 sec)
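
For reference, a minimal sketch of the EID lookup (assuming the Sakai cover classes; not the exact code in the tool):

import org.sakaiproject.user.api.User;
import org.sakaiproject.user.cover.UserDirectoryService;

// Resolve the enterprise id (EID) of the currently logged-in user instead of the
// internal UUID, so the table stores "student02" rather than the GUID.
User current = UserDirectoryService.getCurrentUser();
String eid = current.getEid();   // e.g. "student02"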

Todo:

  • Database
    • Oracle support.
  • CourseFeed Application
    • Re-direct success and fail URLs in Sakai CourseFeed application.
    • Remove extra links from Sakai CourseFeed application. [done]
  • GetCourseList - wantAll flag
    • Check on effects of SU on other functions. Still need to log in?
  • GetUserInfo
    • Check SakaiPerson for additional personal information.
  • IntegrateFacebook action
    • Add a system configuration switch. [done]
  • Login
    • Write to the JSESSIONID cookie. [done]
  • Documentation
    • Better install documentation. [done]
    • Include alternative activation and sakai property. [done]

Jan. 13, 2009
-------------

Added code to doRestIntegrateFacebook() to take pass/fail URLs as parameters and redirect to the appropriate URL on exit.
Updated documentation in CourseFeedServlet.

Added the following system configuration strings to sakai.properties:

# ClassTop CourseFeed registration setting can be either:  servlet or workspace.
classtop.coursefeed.registration=servlet
# ClassTop CourseFeed unregister setting displays an unregister link if true.
classtop.coursefeed.unregister=true

Added code to servlet to check for registration setting. Added code to MainProducer to check for unregister setting.
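
A minimal sketch of those checks (assuming the ServerConfigurationService cover class; the actual servlet and producer code may differ):

import org.sakaiproject.component.cover.ServerConfigurationService;

// Read the two sakai.properties settings above, with conservative defaults.
String registration = ServerConfigurationService.getString("classtop.coursefeed.registration", "workspace");
boolean showUnregister = ServerConfigurationService.getBoolean("classtop.coursefeed.unregister", false);

if ("servlet".equals(registration)) {
    // allow the IntegrateFacebook action to be handled by the gateway servlet
}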

Use this link to test since success/failed is now required:

	http://localhost:8080/cfgateway?action=IntegrateFacebook&fbId=636389812&success=http://www.nolaria.com&failed=http://www.nolaria.org

Configurations work. Installed new version of Sakai CourseFeed on demo71. Access it via:

	http://demo71.classtop.com:8090/portal/

Todo:

  • Database
    • Oracle support.
  • CourseFeed Application
    • Re-direct success and fail URLs in Sakai CourseFeed application.
  • GetCourseList - wantAll flag
    • Check on effects of SU on other functions. Still need to log in?
  • GetUserInfo
    • Check SakaiPerson for additional personal information.
  • IntegrateFacebook action
  • Login
  • Documentation
    • Finish documentation.
    • More detail in REST requests.

May 18, 2009
------------

9:00am to 11:00am

Created empty servlet. Adjusted servlet and tool structures and names.
Servlet is stubbed with a hello world message.

Got stubbed version working.

The REST interface developed by Jeff K. uses POST requests of the form:

http://publish.softchalk.com/myservice/getFolders.jsp?license=12345&accessCode=6789&contentId=123

Note that a different JSP is used to handle each command. We propose to alter this slightly to:

http://localhost:8080/lbgateway/getFolders?license=12345&accessCode=6789&contentId=123

In this case, the lbgateway application (the Sakai REST handler for SoftChalk LessonBuilder) has a command as part of the URL before the CGI parameters. Jeff tells me that all requests are POST, in spite of using GET parameter passing. All current requests pass a license and accessCode. I propose to drop these and replace them with a session id obtained after login. We could stuff this into a cookie, if necessary, like Sakai currently does.

2:30pm to 4:30pm
Worked on debugging support, error result reporting, etc.

May 19, 2009
------------

10:00am to 12:00nn

Added a stub to include the caller IP address in error messages. Escaped the XML part of the error result.
Created extractCommand() to extract the command node from the request URL.
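
A minimal sketch of what extractCommand() might look like (an assumption based on the URL scheme above, not the servlet's actual code):

import javax.servlet.http.HttpServletRequest;

// For a request like http://localhost:8080/lbgateway/getFolders?contentId=...,
// getPathInfo() returns "/getFolders"; the command is the first path segment.
private String extractCommand(HttpServletRequest req) {
    String path = req.getPathInfo();
    if (path == null || path.length() <= 1)
        return null;                        // no command supplied
    String cmd = path.substring(1);         // drop the leading slash
    int slash = cmd.indexOf('/');
    return (slash >= 0) ? cmd.substring(0, slash) : cmd;
}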

(Note: my servlet documentation on laptop doesn't include Servlet Javadoc).

Started work on the login command, whose request is of the form:

http://localhost:8080/lbgateway/login?name=admin&pw=admin

Worked the first time (I love that) with the following result:

<response>
<callerIP>127.0.0.1</callerIP>
<user>admin</user>
<sessionId>235213a8-d68f-492f-ac7d-aeca636de305</sessionId>
</response>

Also checked for failure by supplying an invalid password:

<response>
<error>LoginFailure</error>
<callerIP>127.0.0.1</callerIP>
</response>

The inclusion of a sessionId is a new parameter with respect to the SoftChalk REST interface, but it's required by Sakai. It needs to be passed in subsequent requests (though the session id is also set in the JSESSIONID cookie). Need to talk to Jeff about this. It might be simplest to just check the cookie. That way, if the user logs into Sakai and then tries to make a REST request, it will still work. Not sure how SoftChalk would detect this and know to present a login dialog box, though. It might be best to just require users to log into Sakai and then either rely on the cookie or explicitly pass it as a session id.

May 23, 2009
------------

9:30am to 10:30am - in flight, Atlanta to Newark

Added code to extract the session id from cookie. Code seems to work ok.
Added getProductVersion support. Added "sakai.version" to sakai.properties. This isn't the right way to get the Sakai version, but is a stub.
Added a help command that lists supported requests.
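
A rough sketch of the cookie extraction (names are illustrative, not the exact code):

import javax.servlet.http.Cookie;
import javax.servlet.http.HttpServletRequest;

// Find the session cookie and strip the ".<server>" suffix that Sakai appends,
// e.g. "235213a8-d68f-492f-ac7d-aeca636de305.127.0.0.1" -> the bare session id.
private String findSessionId(HttpServletRequest req, String cookieName) {
    Cookie[] cookies = req.getCookies();
    if (cookies == null)
        return null;
    for (Cookie c : cookies) {
        if (cookieName.equals(c.getName())) {
            String value = c.getValue();
            int dot = value.indexOf('.');
            return (dot > 0) ? value.substring(0, dot) : value;
        }
    }
    return null;
}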

11:00am to 12:00nn

http://localhost:8080/lbgateway/getCourses?

Added the getCourses command. It returns the following result:

<?xml version="1.0" encoding="UTF-8"?>

<response>
<callerIP>127.0.0.1</callerIP>
<course>
<courseid>5ca85894-d498-4410-96fd-9b57c8132c7c</courseid>
<coursetitle>BIO-101-s2</coursetitle>
<contentid>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/</contentid>
<instructor>admin</instructor>
</course>
<course>
<courseid>6f567b70-4a54-4686-b2a7-d3ecb84f5768</courseid>
<coursetitle>ENG-201-s3</coursetitle>
<contentid>/group/6f567b70-4a54-4686-b2a7-d3ecb84f5768/</contentid>
<instructor>admin</instructor>
</course>
</response>

There seems to be a problem with login. The session id may not be written to the cookie correctly. The above only works if you log into Sakai as well as the REST handler.

May 27, 2009 - Indianapolis Airport
------------

4:30pm to 5:30pm

Started work on get folders. Designing the code such that it can handle getCourseTOC, getFolders, and getFoldersAndItems.

http://localhost:8080/lbgateway/getFolders?contentId=/group/5ca85894-d498-4410-96fd-9b57c8132c7c/

Initial response, before access to Content Hosting:

<?xml version="1.0" encoding="UTF-8"?>
<response>
<callerIP>127.0.0.1</callerIP>
<contents>
<folders>
<folder>
<name>[folder name goes here]</name>
<id>[folder id goes here]</id>
</folder>
</folders>
</contents>
</response>

Got collection and properties. Current results (prior to recursive scan):

<?xml version="1.0" encoding="UTF-8"?>
<response>
<callerIP>127.0.0.1</callerIP>
<contents>
<folders>
<folder>
<name>BIO-101-s2</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/</id>
</folder>
</folders>
</contents>
</response>

6:45pm to 7:30pm

Added code to scan for folders. Results are now:

<?xml version="1.0" encoding="UTF-8"?>
<response>
<callerIP>127.0.0.1</callerIP>
<contents>
<folders>
<folder>
<name>BIO-101-s2</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/</id>
</folder>
<folder>
<name>Animals</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/</id>
</folder>
<folder>
<name>Flowers</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Flowers/</id>
</folder>
<folder>
<name>Tulips</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Flowers/Tulips/</id>
</folder>
</folders>
</contents>
</response>

isAvailable and isEmpty attributes on folder elements are not included at this time.

10:00pm to 10:30pm - in flight to Ithaca.

Added isAvailable and isEmpty:

<?xml version="1.0" encoding="UTF-8"?>
<response>
<callerIP>127.0.0.1</callerIP>
<contents>
<folders>
<folder isAvailable="true" isEmpty="true">
<name>BIO-101-s2</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/</id>
</folder>
<folder isAvailable="true" isEmpty="true">
<name>Animals</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/</id>
</folder>
<folder isAvailable="true" isEmpty="true">
<name>Flowers</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Flowers/</id>
</folder>
<folder isAvailable="true" isEmpty="true">
<name>Tulips</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Flowers/Tulips/</id>
</folder>
<folder isAvailable="true" isEmpty="false">
<name>Fishes</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Fishes/</id>
</folder>
</folders>
</contents>
</response>

Note that the Fishes folder is empty.
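
A sketch of how the two attributes can be derived (assuming the Sakai 2.5 Content Hosting cover class; the real code may differ):

import org.sakaiproject.content.api.ContentCollection;
import org.sakaiproject.content.cover.ContentHostingService;

// isAvailable: release/retract dates and hidden flags allow the current user to see it.
// isEmpty: the collection has no immediate members (resources or sub-folders).
private boolean[] folderFlags(String folderId) throws Exception {
    ContentCollection folder = ContentHostingService.getCollection(folderId);
    boolean isAvailable = folder.isAvailable();
    boolean isEmpty = folder.getMembers().isEmpty();
    return new boolean[] { isAvailable, isEmpty };
}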

http://localhost:8080/lbgateway/getCourseTOC?courseId=5ca85894-d498-4410-96fd-9b57c8132c7c

Added code to get course TOC, which is very similar to get folders, but takes a course (site) id instead of a collection id.

Getting an unknown id error. Not the right site id?

May 30, 2009
------------

As I suspected, I was using the wrong site id. Corrected the URL above. No prefix is needed to the site reference. The result of getCourseTOC is:

<?xml version="1.0" encoding="UTF-8"?>

<response>
<callerIP>127.0.0.1</callerIP>
<coursetoc>
<folders>
<folder isAvailable="true" isEmpty="true">
<name>BIO-101-s2</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/</id>
</folder>
<folder isAvailable="true" isEmpty="true">
<name>Animals</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/</id>
</folder>
<folder isAvailable="true" isEmpty="true">
<name>Flowers</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Flowers/</id>
</folder>
<folder isAvailable="true" isEmpty="true">
<name>Tulips</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Flowers/Tulips/</id>
</folder>
<folder isAvailable="true" isEmpty="false">
<name>Fishes</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Fishes/</id>
</folder>
</folders>
</coursetoc>
</response>

Note that the only difference between this and getFolders is the <contents> container vs. <coursetoc>. This is because the content structures are the same in Sakai, whereas in Blackboard they are likely to be different.
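
A sketch of the site-id-to-collection mapping getCourseTOC relies on (cover class assumed; my reading, not the exact code):

import org.sakaiproject.content.cover.ContentHostingService;

// The request passes the bare site id (e.g. "5ca85894-d498-4410-96fd-9b57c8132c7c");
// Content Hosting maps it to the site's root collection, "/group/<siteId>/".
String rootCollectionId = ContentHostingService.getSiteCollection(siteId);
// from here the folder scan is the same as getFolders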

Added code to getItems.

http://localhost:8080/lbgateway/getItems?contentId=/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/

Current results are:

<?xml version="1.0" encoding="UTF-8"?>
<response>
<callerIP>
127.0.0.1</callerIP>
<contents>
<items>
<item isAvailable="true">
<name>Zoology.com</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/http:__www.zoology.com</id>
<type>text/url</type>
<contenthandler>[handler goes here]</contenthandler>
</item>
<item isAvailable="true">
<name>Flowers Web page</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/Flowers Web page</id>
<type>text/html</type>
<contenthandler>[handler goes here]</contenthandler>
</item>
</items>
</contents>
</response>

This example uses a contentid for the Animals folder. Currently, the code only reports on items in a specified collection folder. It does NOT recursively scan for sub-folders. I suspect that this is the implied behavior, but the documentation doesn't give a lot of evidence for it. I'm going to have to send email to Jeff Kahn and get an interpretation.
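
A sketch of the non-recursive listing as currently coded (cover class assumed; checked exceptions pushed up via throws):

import java.util.List;
import org.sakaiproject.content.api.ContentCollection;
import org.sakaiproject.content.api.ContentResource;
import org.sakaiproject.content.cover.ContentHostingService;
import org.sakaiproject.entity.api.ResourceProperties;

// Report only the resources directly inside the given collection; sub-folders are skipped.
private void listItems(String contentId, StringBuilder xml) throws Exception {
    ContentCollection col = ContentHostingService.getCollection(contentId);
    List members = col.getMemberResources();     // immediate children as ContentEntity objects
    for (Object o : members) {
        if (o instanceof ContentResource) {
            ContentResource r = (ContentResource) o;
            String name = r.getProperties().getProperty(ResourceProperties.PROP_DISPLAY_NAME);
            xml.append("<item><name>").append(name)
               .append("</name><id>").append(r.getId())
               .append("</id><type>").append(r.getContentType())
               .append("</type></item>\n");
        }
    }
}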

June 5, 2009
------------

Contracted with DailyRazor.com to host softchalk-sakai.com with the intent of creating a Sakai instance for development and demonstration purposes. After it was set up, I started running into problems. Seems that they will not allow Maven to run on the server. They claim it uses background processes that violate the terms of the user agreement. News to me, but they seemed pretty firm on it.

Checked out sakai_2-5-4

Closer investigation indicates that they are using a non-standard Tomcat directory structure. The webapp directory is moved to /home/mnorton/public_html. This is done, in part, to make the public web page space be shared with WARs. I think there is an app they've added to allow simple web pages to be displayed if dropped into this space, confirmed by adding an "index.html" page. Visible on the internet as "http://softchalk-sakai/index.html".

June 8, 2009
------------

This Sakai build will have to be handled carefully. I've downloaded the tomcat environment from the hosted server. They are running version 5.5.20, which is a bit older than where Sakai is, but it should be ok (I hope). Rather than messing with DailyRazor's non-standard directory structure, I'm going to restore the regular structure by adding the webapp directory back in. This should allow me to build Sakai normally on my local development machine. Then it will need to be deployed (carefully) into the server environment. Once it's up and running on the server, I shouldn't have to do this again until it is time to update Sakai.

The tomcat/bin directory is also missing. God knows where it is.

I'm going to use the setup instructions from Programmer's Cafe (still the best set of install directions around).

1. Install Java 1.5.
DailyRazor has installed Java 1.5_15.

2. Install MySQL
Attempted to use the front panel setup for MySQL. Tried to create a database called "sakai" and it created one called "mnorton_sakai". Sigh. Sakai requires the database to be called "sakai". Put in a request to support for help.

3. Setup Sakai DB schema
This will have to wait until above is ready.
I am informed that the database must be called "mnorton_sakai". It's possible I can work around this, since the name of the database is specified in Sakai properties.
Created a user called "mnorton_mnorton" - again, silly conventions.
I was told I needed to add my local IP number to get remote access. Did that via control panel.
Run command "mysql -umnorton_mnorton -pmark6972 mnorton_sakai@softchalk-sakai.com,
but got a permission violation. Put in another support request.

Real login: mysql -ugodwinyj_11 -pA4zmQwgA

4. Download and install Maven.
Well, I got it locally. Running maven 2.0.7.

5. Install Subversion
Locally running svn version 1.3.1.

6. Download and install Tomcat.
Copied tomcat installation down from server to local computer.
This is version 5.5.20.
Changed CATALINA_HOME to "c:/dev/tomcat"

7. Download and install MySQL DB Connector
Copied connector from older tomcat install to tomcat/common/lib.
I note that there are no other JARs in this directory yet. Some may be added during the first build.

8. Use SVN to download sakai sources
Done. Downloaded sakai_2-5-4 from source.sakaiproject.org/svn/sakai/tags/sakai_2-5-4

9. Set up sakai.properties.
Copied existing sakai.properties file from an older installation into new one.
This includes both the tomcat/sakai directory and the sakai.properties file in it.
This file is pre-configured to use MySQL database (instead of Oracle).

10. Create (update) Maven settings
Edited ../Mark/.m2/settings.xml.
appserver.home -> c:/dev/tomcat
maven.tomcat.home -> c:/dev/tomcat
sakai.appserver.home -> c:/dev/tomcat
These changes allow maven to deploy build to correct tomcat directories.

11. Use maven to build sakai.
first "mvn clean install"
then "mvn sakai:deploy"
Sakai built without error.

June 9, 2009
------------

Daily Razor recommended creating a user via cPanel and adding permissions that way. I created a user called "sakai", which appears as "mnorton_sakai" and gave it all permissions. This user has a password of "ironchef". I'm not sure this is going to work, but I'll make the changes to the Sakai properties, upload the WARs and see if Sakai will create the correct database tables. I give it a 30% chance of success.

June 10, 2009
-------------

Still have problems connecting to database using mysql shell command.

Checked database names one more time. Confirmed changes in sakai.properties.
Anthony told me that the sakai version is a property called "version.sakai". I was close. Added it to the properties file.

It's now time to upload the Sakai WARs and JARs. This has to be done carefully since Sakai puts things in various places and Daily Razor has messed with that a bit. Fortunately, the biggest change seems to be where the web apps live (public_html instead of webapps).

FTP is being very slow. This may take quite some time. Took nearly two hours.

Tomcat failed to start. Several errors, chief of which is that it ran out of memory. UTF-8 settings are not correct, either.

Updated Servlet documentation on my laptop. This will make development on the road easier. I also downloaded commons-fileupload-1.2.1 to handle file upload. I've only found 1.1.1 in a maven repo.

Most of the remaining REST commands have to do with uploading one or more files.
Looks like I misremembered things a bit. I thought I had used the Apache Commons FileUpload package in Sousa to upload files, but I used Spring instead. That's probably the simpler route, since Spring is already included in Sakai.

Looks like Maven found commons-fileupload-1.2.1, so I can use that.
Added initial stubs for addItem request.

http://localhost:8080/lbgateway/addItem?courseId=myCourse&contentId=/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/&name=Elephant

Result is:

Course id: myCourse<br/>
Content id: /group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/<br/>
Item name: Elephant<br/>

I can probably lose the breaks for now.
Next step is to create a web form using POST to submit the same request.

Added code to detect multipart data with this result:

Course id: MyCourse
Content id: Animals
Item name: Elephant
Multipart content included.

Thrashed around trying to access the uploaded item. The commons upload package is a rat's nest of classes and not very well documented. The following code compiles and should work:

try {
    ServletRequestContext ctx = new ServletRequestContext(this.req);
    ServletFileUpload upload = new ServletFileUpload();
    FileItemIterator it = upload.getItemIterator(ctx);
    int ct = 0;
    while (it.hasNext()) {
        FileItemStream fi = (FileItemStream) it.next();
        results.append ("/tItem no." + ct + ": " + fi.getName() + "\n");
        ct++;
    }

    if (ct == 0)
        results.append ("No items were found.\n");
}
catch (Exception ex) {
    results.append ("Exception encountered while trying to parse upload items.");
}

Sadly, it throws an exception from the Sakai request filter. Why? I have no idea.
This is the offending line:

FileItemIterator it = upload.getItemIterator(ctx);

Likely something in getItemIterator is calling off to the context causing an exception:

WARN: (2009-06-10 16:30:25,687 http-8080-Processor25_org.sakaiproject.util.RequestFilter)
javax.servlet.ServletException: Servlet execution threw an exception
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)

Still struggling with apache commons upload.
The Spring approach might be better.

From UploadItemBean in Sousa:

import org.springframework.web.multipart.commons.CommonsMultipartFile;

Downloaded the Spring API to c:/

June 11, 2009
-------------

Spoke with Sue Evans at SoftChalk. She recommended using softchalk.com.
SSH as godwinyj@qs1908.pair.comm
pw: Raining

I had to set up environment variables by hand using setenv:

CATALINA_HOME to /usr/www/users/godwinyj/dev/apache-tomcat-5.5.26
JAVA_OPTS to (usual)
JAVA_HOME to /usr/local/diablo-jdk1.5.0

started tomcat from shell.
Was able to access tomcat via the internet using:

http://www.softchalk.com:8080/

If I can get maven installed and set up the environment variables to come up automatically, I think we can install Sakai on this machine.

June 12, 2009
-------------

Added environment variables to .cshrc in ~godwinyj.
Installed Maven 2.0.10, added to shell path. Works fine.
Tried to access mysql - bad username/password. Put a request in to get name/pw.
Downloaded sakai from svn to ~godwinyj/mark

Started playing with file upload again via Apache Commons.
I found what I think may be the problem. The Sakai request filter has an init-param for file upload that needs to be set to false in web.xml. - Didn't work, darn it.

I could take out the Sakai request filter, since it seems to be causing problems, but this is what picks up the cookie and restores the user session. If I take this out, then I need to get my cookie detection to work.

Returning to: http://localhost:8080/lbgateway/login?name=admin&pw=admin

and http://localhost:8080/lbgateway/getProductVersion?

These should allow me to test user sessions.

<response>
<callerIP>
127.0.0.1</callerIP>
<user>
admin</user>
<sessionId>
f4bcc6fe-6d40-4be4-84ea-9e87d1162d1f</sessionId>
</response>

        • LessonBuilder - request command is login
        • LessonBuilder - request command is getProductVersion
        • LessonBuilder: session cookie is: javax.servlet.http.Cookie@1d6f830
        • - LessonBuilder - user id is: null

There's a problem. The session id isn't being stored in the cookie correctly. This was a problem in the debug output. Try again:

        • LessonBuilder - request command is login
        • LessonBuilder: Login: Cookie set is: JSESSIONID = 7e00b4e4-79e1-4b25-bd86-fc2608fd69a3.127.0.0.1
        • LessonBuilder - request command is getProductVersion
        • LessonBuilder: session cookie is: ce8c1c4d-28e9-4ae9-beb5-ea4cbf62625f.localhost
        • - LessonBuilder - user id is: null

So the cookie coming back is not the one being sent.
Looking at the cookies set in my browser (FireFox), I see that there are two JSESSIONID's set in localhost.
Deleted them both and re-ran two URL tests:

        • LessonBuilder - request command is login
        • LessonBuilder: Login: Cookie set is: JSESSIONID = 85d7aee9-34e1-4d6a-811f-708301c911e3.127.0.0.1
        • LessonBuilder - request command is getProductVersion
        • LessonBuilder: session cookie is: f3313e8f-ded0-4382-b26e-34b055460dcd.localhost
        • - LessonBuilder - user id is: null

Somebody is writing a second cookie. It might be the Sakai Filter.
You know, there is a simple way around this. I can try writing to my own cookie. Worth a try. We'll try a name of "SOFTCHALKID". Results are now:

        • LessonBuilder - request command is login
        • LessonBuilder: Login: Cookie set is: SOFTCHALKID = d7ea5780-adb8-49a3-b571-2dab7bff873f.127.0.0.1
        • LessonBuilder - request command is getProductVersion
        • LessonBuilder: Cookie found is: SOFTCHALKID = d7ea5780-adb8-49a3-b571-2dab7bff873f.127.0.0.1
        • - LessonBuilder - user id is: admin

This looks MUCH better. Note that the user id is no longer coming back null. This should allow me to maintain a session without logging into Sakai. One side effect is that logging into Sakai won't get you access from LessonBuilder. The LessonBuilder will ALWAYS have to log in and start a session.

Unfortunately, when I tried, http://localhost:8080/lbgateway/getCourses?, I didn't get any courses. Sigh.

Added code to refresh user with AuthzGroups: AuthzGroupService.refreshUser(userId);

No joy. Courses are still not coming back, but when I log into Sakai, they are available. This says that I am not establishing a true user session. Sakai is up and running, so I shouldn't have to invoke the portal to make this happen.

Built a database on softchalk.com using front panel called godwinyj_sakai. User name is 'godwinyj_11' and the password is 'A4zmQwgA', which can be used to log into mysql from a secure shell. Adjusted username, password, and database name in sakai.properties. Made some fixes to .cshrc and started building Sakai using maven.

mysql -ugodwinyj_11 -pA4zmQwgA
connect godwinyj_sakai;

Build failed once. Restarted. Success. Deployed.

June 13, 2009
-------------

When I went to start up Sakai for the first time on softchalk.com, I was startled to find that tomcat/bin was GONE! Gasp! Did they screw up my software over night? What could be the problem????

Simple, my settings were wrong. I put apache-tomcat.5.5.27 into settings.xml instead of 5.5.26, so Maven merrily created a whole tomcat directory structure for the things it needed to deploy. However, Maven wasn't asked to set up tomcat (that was supposed to be done before running maven), so the binaries were not there.

Also simple to fix. Changed CATALINA_HOME to 5.5.26, and the settings.xml file. Now re-running maven to build sakai and deploy it to 5.5.26.

Built without error, and started up without error. Sakai is now running on http://www.softchalk.com:8080/portal.

Went back and looked at SakaiScript, specifically the establishSession() method. I wasn't setting the current session. Once set, cookies are now working. Able to login and then see a list of courses without logging into Sakai first. Wheee!!!!
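
For the record, a sketch of the missing step, patterned on SakaiScript.establishSession() (cover class assumed; not the exact code):

import org.sakaiproject.tool.api.Session;
import org.sakaiproject.tool.cover.SessionManager;

// Look up the session created at login and make it current for this request thread;
// without setCurrentSession(), calls like getCourses() see no user and return nothing.
Session s = SessionManager.getSession(sessionId);
if (s != null) {
    s.setActive();                        // touch it so it isn't reaped as idle
    SessionManager.setCurrentSession(s);  // this was the missing call
}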

Now I just need to get file upload to work, and finish off the rest of the app.

June 14, 2009 - Flight to LA/Long Beach
-------------

5:00a - 6:00a Ithaca Airport

Ok, the major remaining problem is file upload. I've tried the Apache Commons Upload twice now, but it doesn't work. I get an exception from the Sakai RequestFilter. Now that I'm managing my own cookie, I shouldn't need the RequestFilter any more. First thing to do is to confirm that it still isn't working (you can't trust software, sometimes it heals itself).

Still fails:

WARN: (2009-06-14 05:28:09,625 http-8080-Processor24_org.sakaiproject.util.RequestFilter)
javax.servlet.ServletException: Servlet execution threw an exception
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:188)
at org.sakaiproject.util.RequestFilter.doFilter(RequestFilter.java:592)

Before I move on to stubbing out the RequestFilter, I'm going to try and catch errors like this one. Heh, so much for that. The error happens outside of my code and can't be caught. That's strange, when you think about it. It means that my code isn't causing an error, but the form submission is. Whatever, putting in the top-level try/catch will handle any exceptions that do arise from my code.

Commented out the RequestFilter in web.xml. Now I'm getting a different exception:

java.lang.NoSuchMethodError: org.apache.commons.fileupload.servlet.ServletFileUpload.getItemIterator(Ljavax/servlet/http/HttpServletRequest;)Lorg/apache/commons/fileupload/FileItemIterator;
com.softchalk.lessonbuilder.servlet.LessonBuilderServlet.doAddItem(LessonBuilderServlet.java:664)
com.softchalk.lessonbuilder.servlet.LessonBuilderServlet.handleRequest(LessonBuilderServlet.java:188)
com.softchalk.lessonbuilder.servlet.LessonBuilderServlet.doPost(LessonBuilderServlet.java:123)
javax.servlet.http.HttpServlet.service(HttpServlet.java:647)
javax.servlet.http.HttpServlet.service(HttpServlet.java:729)

This is interesting. The method, ServletFileUpload.getItemIterator() doesn't seem to exist. So much for documentation. I checked the version in the maven pom. Set to v1.2.1, which is the latest as far as I know. I do have the sources with me, so I can look there, but it's strange that it let me compile the code only to fail at runtime. It seems to suggest that the interface has the method, but it isn't implemented. That or I'm picking up the wrong version somehow.

9:00 - 10:00 Flight to Detroit

Recoded the simple case to use parseRequest().

java.lang.NullPointerException
at org.apache.commons.fileupload.FileUploadBase.createItem(FileUploadBase.java:500)
at org.apache.commons.fileupload.FileUploadBase.parseRequest(FileUploadBase.java:367)
at org.apache.commons.fileupload.servlet.ServletFileUpload.parseRequest(ServletFileUpload.java:116)
at com.softchalk.lessonbuilder.servlet.LessonBuilderServlet.doAddItem(LessonBuilderServlet.java:644)

Switched back over to the straight fetch of input stream data from the HttpServletRequest. Now it seems to be working. I think dropping the RequestFilter may have allowed this to start working. Data is coming back as digits. I need to convert them to characters to see if the data is right.

Ok, reading input stream gives us this:

Data from input stream: -----------------------------265001916915724
Content-Disposition: form-data; name="courseId"

MyCourse
-----------------------------265001916915724
Content-Disposition: form-data; name="contentId"

Animals
-----------------------------265001916915724
Content-Disposition: form-data; name="name"

Description
-----------------------------265001916915724
Content-Disposition: form-data; name="fileitem"; filename="test.txt"
Content-Type: text/plain

This is a text messasge used to test file upload.

----------------------------265001916915724-

11:00a - 2:00p Flight to LA

Each of the form fields is contained in this output. FileUpload.parseRequest() is supposed to break this up and give it back to me as a series of DiskFileItems. Before beating my head against that wall again, I'm going to have a look at a simpler example by creating a form with only a file input field and parameters encoded in the URL.

This time the results are:

Course id: MyCourse
Content id: Animals
Item name: null
Multipart content included.
Data from input stream: -----------------------------114782935826962
Content-Disposition: form-data; name="fileitem"; filename="test.txt"
Content-Type: text/plain

This is a text messasge used to test file upload.

----------------------------114782935826962-

This is much closer to what we need, but I'd still need to strip off the field headers and boundary markers.
I wonder why Jeff's example doesn't suffer from this problem?

Both of these forms have a stream that matches what FileUpload should be able to handle.

Venturing a guess, I'd say that either Jeff's example does conform to RFC-1867 or it should be made to conform.

Switched back to the simple Apache approach. It failed again with a null pointer. I note that the exception refers to createItem(), though the line number is wrong. If we look at createItem(), we see:

protected FileItem createItem(Map /* String, String */ headers,
                              boolean isFormField)
    throws FileUploadException
{
    return getFileItemFactory().createItem(getFieldName(headers),
            getHeader(headers, CONTENT_TYPE), isFormField, getFileName(headers));
}

There's not a lot to go wrong there, except for one thing: if getFileItemFactory() returns null, this will throw a NullPointerException. Since that's the error I'm getting and I don't use a factory, I'm going to assume that this is the problem. So, we'll just add a factory and see what happens.
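
A sketch of that change (assuming commons-fileupload 1.2.x; not necessarily the exact lines):

import java.util.List;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

// Give the parser a factory so createItem() has something to delegate to.
DiskFileItemFactory factory = new DiskFileItemFactory();
ServletFileUpload upload = new ServletFileUpload(factory);
List items = upload.parseRequest(this.req);   // List of FileItem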

Be still my beating heart, it worked. Results are:

Course id: MyCourse
Content id: Animals
Item name: Description
Multipart content included.
/tItem no.0: test.txt

I'd much rather use the Apache FileUpload than parse raw input as Jeff's example does. For the record, here's the results from the full form:

Course id: null
Content id: null
Item name: null
Multipart content included.
/tItem no.0: null
/tItem no.1: null
/tItem no.2: null
/tItem no.3: test.txt

Since parameters are going to be URL-encoded, I don't really need the full form to work. Just let me upload a simple file, please.

So the simple Apache upload approach works based on parseRequest(), but that gives me a List of FileItem objects. FileItems want to treat items as text. What I REALLY need is a list of FileItemStream objects, which is what ServletFileUpload.getItemIterator() is supposed to give me. Reading ServletFileUpload closely, I see the following on the default constructor:

/**
 * Constructs an uninitialised instance of this class. A factory must be
 * configured, using <code>setFileItemFactory()</code>, before attempting
 * to parse requests.
 *
 * @see FileUpload#FileUpload(FileItemFactory)
 */

This explains why my attempt at getItemIterator() failed as it did: there was no factory. Sadly, the package doesn't seem to provide a "StreamItemFactory". Furthermore, FileItemStream is an interface and I can't find an implementation of it. One almost has the impression that this software package is incomplete, but that's VERY unlikely coming from Apache.

Ok, one more try. I initialized ServletFileUpload with a DiskFileItemFactory in the stream case. Let's see if it somehow, magically, returns an iterator of FileItemStream objects. It gives me the same "no such method" error as before, which is strange because getItemIterator() is clearly in the code. I strongly suspect that Maven didn't pull down the version of the code that I have.

Clearly, this is going to need more work when I have internet connectivity (which I don't currently have at 10,000 feet).
So, I'm going to break out getting the file into its own method and leave it set to using the FileItem approach. This will limit me to uploading only text files currently, but I can live with that until we can figure out how to do binary files.

ContentResourceEdit wants an InputStream. I can provide that easily enough from a text stream.

In the process of breaking out UploadedFile, I noticed that FileItem has a getInputStream() method. Doh! This likely does everything I need it to. I think I might move it back into the main servlet.
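
A sketch of how that could feed Content Hosting directly (names and ids are illustrative; checked exceptions pushed up via throws):

import java.util.List;
import org.apache.commons.fileupload.FileItem;
import org.sakaiproject.content.api.ContentResourceEdit;
import org.sakaiproject.content.cover.ContentHostingService;

// Take the single uploaded FileItem and write it into the target collection,
// streaming the content so binary files work as well as text.
private void storeUpload(List items, String newContentId) throws Exception {
    FileItem fi = (FileItem) items.get(0);
    ContentResourceEdit edit = ContentHostingService.addResource(newContentId);
    edit.setContentType(fi.getContentType());
    edit.setContent(fi.getInputStream());
    ContentHostingService.commitResource(edit);
}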

5:30p - 6:30p Long Beach Hilton

Finished eliminating compile errors. UploadedFile.java is still there, but not being used. I'm keeping it around in case I need the old code. Meanwhile, fixed up doAddItem() to use parseRequest(). It assumes a single form field, which is a file. Results are:

Course id: MyCourse
Content id: Animals
Item name: Description
Multipart content included.
Uploaded File Name: test.txt
Text File Content: This is a text messasge used to test file upload.

Mostly coded. Need to add properties. Will also need a file extension to mime type map. I'm getting kinda tired of writing that. I really wish Sakai had an API for that.

7:30p - 8:30p Long Beach Hilton

Added properties for name (could do description, too, if I had it).

http://localhost:8080/lbgateway/addItem?courseId=myCourse&contentId=/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Test/&name=TestFile

Some success. Resource entry was created with correct name. Size of document seems right, but I cannot access the content.
Created a "Test" folder. Will tag id with a timestamp to make them unique.

Created item in the correct folder. Also, when I clicked on it in the Resource tool, it offered to download it. This is good. I think it's working.

Some things to do:

1. Revert debug mode to being based on parameter flag. Make XML mode the default.
2. Create the response text for addItem.

Added response text for addItem:

<response>
<callerIP>127.0.0.1</callerIP>
<newContentId>
/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Test/test.txt-1245027225187
</newContentId>
</response>

Made XML mode the default return display.

June 16, 2009
-------------

Got Jeff's Sakai instance and REST app working.

June 17, 2009
-------------

12:00n - 13:30 - LAX

A bit of research reveals that URL encoding can be done as follows:

import java.net.URLEncoder;
String encoded = URLEncoder.encode ("string", "UTF-8");

and decoded using

import java.net.URLDecoder;
String decoded = URLDecoder.decode ("string", "UTF-8");

The following data elements (all of them) should be encoded using the above:

  • GetCourses
    • Instructor
    • Titles
  • GetCourseTOC
    • Name
  • GetFolders
    • Name
  • GetItems
    • Name
  • GetFoldersAndItems
    • Name

Tried to do this the right way (with UTF-8 encoding), but that could throw an encoding exception, which is a pain in the ass. So I used the deprecated method instead, which doesn't take an encoding type. I'm fairly confident that it will handle it ok anyway. Tested - seems to work, though I don't have any strings that would force encoding.
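
If the checked exception is the only objection, a tiny wrapper keeps the non-deprecated UTF-8 overload usable (just a sketch):

import java.io.UnsupportedEncodingException;
import java.net.URLEncoder;

// UTF-8 support is guaranteed by the Java spec, so the checked exception can be wrapped.
private static String enc(String s) {
    try {
        return URLEncoder.encode(s, "UTF-8");
    } catch (UnsupportedEncodingException e) {
        throw new RuntimeException(e);   // effectively unreachable
    }
}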

Added code for new folder. Tested, works. Results are:

<response>
<callerIP>127.0.0.1</callerIP>
<newContentId>
/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Test/TestFolder-1245258423328/
</newContentId>
</response>

8:00p - 9:00p - Philadelphia Airport

Added checks for addZipItem, addSCORMItem, and addSCORM2004Item. All of these call doAddItem() and currently ignore the other parameters that might be submitted. Some of these have to do with grade book integration (not currently supported by Sakai SCORM delivery). Others are SCORM control parameters that need to be investigated.

June 18, 2009
-------------

Back in Ithaca (finally!)
Started work on replacing items. Jeff's documentation provides a content id, name, and the replacement content. Presumably, the name could change. IM'ed with Jeff. Name doesn't have to change. Just replace the content.

Code added to replace an item.

http://localhost:8080/lbgateway/addItem?&contentId=/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Test/test.txt-1245027225187

Results are:

<response>
<callerIP>127.0.0.1</callerIP>
<success/>
</response>

Content of that item is now:

Replacement text message.

This is now working.

June 22, 2009
-------------

Tagged lbgateway as version 0.2. Zipped it up and installed it on SoftChalk.com.
Sources are in ~godwinyj/mark/sakai-2.5.4.
Version numbers in two poms need to be changed at the root of softchalk, and softchalk/lb-gateway/tool

June 23, 2009
-------------

Added a linear listing of collections in a folder as an alternative to recursive descent. Both methods are still present in the code, which gives us the option of recursive scanning if desired.

Tested, working.

Fixed bug in isEmpty attribute - sense of test was reversed.

Jeff refined the parameters of getCourseTOC to take only the course id (not contentId). getFolders will take only a contentId.
Looking at getCourseTOC, it seems to already work that way.

Discovered the ContentTypeImageService in the Sakai CHS. This allows me to get the MIME type based on extensions. Wish I had known about that back when I was working on Sousa. Added the code and tested it on a GIF image. Seems to work ok.
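
A sketch of the lookup (cover class assumed, and fileItem here stands for the uploaded file; worth double-checking against the 2.5 API):

import org.sakaiproject.content.cover.ContentTypeImageService;

// Derive the MIME type from the uploaded file's extension, e.g. "gif" -> "image/gif".
String name = fileItem.getName();                                // e.g. "flower.gif"
String ext = name.substring(name.lastIndexOf('.') + 1).toLowerCase();
String mimeType = ContentTypeImageService.getContentType(ext);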

After discussing it with Jeff, we agreed that content handler is not needed in content items. Commented out the code.

Expand will be "true" or "false".

Implemented delete command.

June 24, 2009
-------------

Jeff mentioned problems with v0.4 in that XML was not being returned from a call. Rather, the HTML debug wrapping was on for some reason. I don't really see much use for this at this time, so I'm going to strip it out.

Released as 0.5.

June 25, 2009
-------------

Implemented getItemsAndFolders. Results are:

<response>
<callerIP>127.0.0.1</callerIP>
<contents>
<folders>
<folder isAvailable="true" isEmpty="false">
<name>Mammals</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals//</id>
</folder>
</folders>
<items>
<item isAvailable="true">
<name>Zoology.com</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/http:__www.zoology.com</id>
<type>text/url</type>
</item>
<item isAvailable="true">
<name>Flowers+Web+page</name>
<id>/group/5ca85894-d498-4410-96fd-9b57c8132c7c/Animals/Flowers Web page</id>
<type>text/html</type>
</item>
</items>
</contents>
</response>

Fixed bug resulting in a NullPointerException when not logged in.

Fixed bug in delete leading to IdUnusedError.

Fixed bug in file upload where type was always 'text/text'

June 26, 2009
-------------

Coded up getFile. I get a null pointer error after getting the servlet configuration. It's coming back null from the servlet. Searching the web seems to point to a problem in the web.xml file. I'm not using the servlet-name as the URL spec. Changed it to "lbgateway", but that didn't work.
I wasn't calling super.init(config) in the init() method. That fixed the null pointer error, but now I'm getting an illegal state exception indicating that I've already called the getWriter() method. The problem was returning error text from the exception catch on the IO. It's too late to send back XML at that point, you can only punt. Fixed now.

June 30, 2009
-------------

Resumed work on trying to get SCORM to be installed locally on my laptop. Got wicket to build by changing the version numbers in both POMs associated with it and adding a wicket module to the base Sakai POM.

The SCORM app has nine different POMS scattered at different levels. All versions needed to be changed from M2 to 2.5.3 (and later 2.5.4 on the SoftChalk server). SCORM built ok in Maven, but I got an error on Sakai startup. I think the database tables are missing.

WARN: Unsuccessful schema statement: create table SCORM_ELEMENT_T (ELEMENT_ID bigint generated by default as identity (start with 1), CLASS_TYPE varchar(255) not null, PARENT bigint, VALUE varchar(255), IS_INITIALIZED bit, TRUNC_SPM bit, NAVIGATION_DM bigint, BINDING varchar(255), IS_RANDOMIZED bit, COUNT integer, primary key (ELEMENT_ID))

(2009-06-30 10:27:06,250 main_org.sakaiproject.springframework.orm.hibernate.AddableSessionFactoryBean) java.sql.SQLException:

Unexpected token: COUNT in statement [create table SCORM_ELEMENT_T (ELEMENT_ID bigint generated by default as identity (start with 1), CLASS_TYPE varchar(255) not null, PARENT bigint, VALUE varchar(255), IS_INITIALIZED bit, TRUNC_SPM bit, NAVIGATION_DM bigint, BINDING varchar(255), IS_RANDOMIZED bit, COUNT]

Created the table by editing the definition of ELEMENT_ID as follows:

mysql> create table SCORM_ELEMENT_T (ELEMENT_ID bigint(20) NOT NULL auto_increment,
CLASS_TYPE varchar(255) not null, PARENT bigint, VALUE varchar(255), IS_INITIALIZED bit,
TRUNC_SPM bit, NAVIGATION_DM bigint, BINDING varchar(255), IS_RANDOMIZED bit,
COUNT integer, primary key (ELEMENT_ID));

Query OK, 0 rows affected (0.31 sec)

Started tomcat up again. Failed again.

July 1, 2009
------------

(comments got lost)
Got SCORM working on softchalk.com. Issue was difference in Sakai version.
Upgraded local system to Sakai 2.5.4.
Requested a node on contrib for lbgateway.

July 2, 2009
------------

Need to work on expanding zips today.
I intended to re-use code from the cc-ocw project to unzip an archive. Took me a while to find it buried in the V3 OSID Filing impl. in fsArchiveDirectoryForm.

Got unzip to work after considerable effort.
Because ContentResource.setContent(InputStream) closes the stream after reading it, the ZipInputStream is closed after processing the first entry. This really sucks. So I had to write each entry to a temp file, read it out of the temp file into CHS, and then delete the temp. This works, but it's noticeably slow. Added support for navigationFilename.
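
A rough sketch of the temp-file workaround described above (names assumed; checked exceptions pushed up via throws):

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import org.sakaiproject.content.api.ContentResourceEdit;
import org.sakaiproject.content.cover.ContentHostingService;

// Copy each zip entry to a temp file, then hand CHS a fresh stream it is free to close.
private void expandZip(ZipInputStream zin, String baseCollectionId) throws Exception {
    ZipEntry entry;
    while ((entry = zin.getNextEntry()) != null) {
        if (entry.isDirectory())
            continue;                                  // collections are created separately
        File tmp = File.createTempFile("lbg", null);
        FileOutputStream out = new FileOutputStream(tmp);
        byte[] buf = new byte[4096];
        for (int n; (n = zin.read(buf)) > 0; )
            out.write(buf, 0, n);
        out.close();
        ContentResourceEdit edit = ContentHostingService.addResource(baseCollectionId + entry.getName());
        edit.setContent(new FileInputStream(tmp));     // this stream gets closed, not the zip stream
        ContentHostingService.commitResource(edit);
        tmp.delete();
    }
}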

July 3, 2009
------------

To support SCORM, I need to somehow register an uploaded archive file. The ScormResourceService seems to provide this via:

String convertArchive(String resourceId, String title) throws InvalidArchiveException;

The trick will be to get an instance of ScormResourceService into lbgateway. There is no cover service, so I may have to try and get Spring to inject this into my app. Maybe I can coerce the Sakai ComponentManager to fetch it.
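
A sketch of the ComponentManager route (the API package name is my assumption; this is the attempt that leads to the class cast trouble below):

import org.sakaiproject.component.cover.ComponentManager;
import org.sakaiproject.scorm.service.api.ScormResourceService;   // package name assumed

// Fetch the SCORM resource service by the name its API interface is registered under.
ScormResourceService scorm = (ScormResourceService)
        ComponentManager.get("org.sakaiproject.scorm.service.api.ScormResourceService");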

Added a doAddScorm2004Item() method. At the moment, it only uploads an archive into a specified collection. This is where I will experiment with convertArchive.

Jeff mentioned having problems with delete. I checked both delete item and folder locally and had no problem. There may be an issue on the Sakai server at softchalk.com, however.

Also checked to make sure I could still add a single item without error. Seems to work just fine.

Created a doTestSCORM2004Item to play with fetching the Scorm Resource Service. Got a class cast exception:

        • lbgateway - SCORM Test - Content name is: Sesame5
          java.lang.ClassCastException: org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService$$EnhancerByCGLIB$$d941d939
          at com.softchalk.lessonbuilder.servlet.LessonBuilderServlet.doTestScorm2

Looking deeper into the code, we find that SakaiResourceService doesn't implement ScormResourceService. WTF? Why have an interface if you don't use it? Worse, why register under an API id? Sigh. I could cast to this kind of object, but it's going to mean dragging the impl into my POM. Sent a msg to James Renfro at UC Davis complaining.

July 4, 2009 - Fireworks!
------------

Created a test-softchalk.htm file with links to the softchalk.com server.
Tested delete again - seems to work just fine as long as the id is correct.
Tested addZipItem. Was able to upload the Sesame package without problem even if I named it Sesame.zip.
These are integration problems of some kind.

Returning to SCORM problem now.
Able to include the service impl as a dependency. Attempted to get the ScormResourceService and cast it to SakaiResourceService - still getting a class cast message. That "EnhancerByCGLIB" in the object identifier modifies the base object in some way. Also, SakaiResourceService is an abstract class. It's very likely that something extends this class, but god only knows where.

A google search on CGLIB reveals a SourceForge project that extends Java classes at runtime. Stands for Code Generation Library. I don't think I'm going to be able to figure this out without help from the UC Davis guys. Too bad, really. If I could get this object into my code, I think it would just work.

Sakai Conference
----------------

Lots of interest. Most want gradebook integration. A few desired to be beta partners. Sheila was happy with interest level and lead generation.

July 21, 2009
-------------

Got a list of things to do after talking with Jeff that includes:

Make replace work
Navigation link for add zip
Add SCORM item
Support for non-course items
Check product version
Custom ordering in a folder (move)
Rename folders (new feature)

Spoke with Jeff and Sue on the phone. Sue had an idea to do gradebook integration directly rather than wait for SCORM-GB integration. This could be done by adding REST command to lbgateway to get/post grades. Sue has asked me to create a proof of concept demonstration using a SoftChalk generated test (simple true/false) that uses JavaScript to post a grade via lbgateway.

Jeff gave me a copy of this version of LessonBuilder that I can use to test lbgateway. Noticed a null pointer error right off and fixed it. Also added (back) a debug flag to show the request and response.

Created a simple true/false test case (OptimistTest). The test page uses frames (ugh).

In the Javascript file called q_functions.js, there is a line that reads "parent.frame_my_score[lessonID] = my_score". I think a REST grade update could be inserted here. Hmm. It depends on the question type. I have to figure out which type code a true/false question is. That, or make the update where the score is posted.

Spoke with Jeff about the need for a package type variable. This would allow me to collapse some REST commands into a single command (addItem, addZipItem, addSCORMItem, addSCORM2004Item) and also have a better idea of how to handle other operations like delete. Also, the code is now at 1400 lines in a single class. I think that's a bit much. Breaking out functions into handler objects would simplify the code somewhat. It's not a very OOD approach, but who cares, really. Hmmm. It could be handled along package type lines. An object for file, zip, SCORM 1.2, and SCORM 2004. Add one for courses and one for folders and we've got it pretty much covered. So:

LBGSite - operations on a course site, etc. (this will handle grading, later)
LBGFolder - operations on a folder
LBGFile - operations on a file
LBGZip - operations on a zip archive.
LBGScorm12 - operations on a SCORM 1.2 package
LBGScorm2004 - operations on a SCORM 2004 package

Later, we can add Common Cartridge and OCW packages, if needed.
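
A rough shape for these handlers (purely illustrative, not the actual class structure):

import javax.servlet.http.HttpServletRequest;

// Each package type gets one handler; the servlet just dispatches on packageType.
public interface LBGHandler {
    String add(HttpServletRequest req) throws Exception;       // each returns the XML response body
    String replace(HttpServletRequest req) throws Exception;
    String delete(HttpServletRequest req) throws Exception;
}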

Implemented LBGSite. Frankly, it took too long to implement. I'm going to have to refactor slowly as I just can't afford the time right now.

July 24, 2009
-------------

Of all of the new handler objects, LBGZip is probably the most important in the short run. That's worth developing right now. It can also serve as a template for File and the Scorm classes.

July 27, 2009
-------------

Coded LBGZip.add()

July 28, 2009
-------------

Coded LBGZip.delete(). Added hooks into the lbgateway servlet. Fixed test-links.html to put uploaded packages into BIO.
Tested add Zip item. Works. Navigation link added as well.

It's a bit tricky to coordinate add, replace, and delete operations. The question is what content id to use. The package is deployed using the "name" property given in the addItem request. The containing collection for the installed package is that name. Currently, the id of the launch link is the name of the file, and its title is the package name. On reflection, that's not going to work since most packages will use "index.html" as the launch file and the ids will collide. Better to use the package name. That helps because then we don't have to cache the launch file name for delete purposes. It can be derived from the content id passed to delete.

As a side note, when I tried to delete the uploaded package using the Resources tool, I got an InconsistentException. It may be that I am creating the collection wrong, somehow.

Changed code to create navigation link with an id that's the same name as the package. Tested, works.
Unable to catch the InconsistentException. I think it's being thrown and caught down in the CHS implementation. The stack trace is printed, but the exception never really reaches my code. Changed the code to attempt the delete of the collection twice. If the second time fails due to an unknown id, it passes through.

Well, even an attempt at a second delete didn't get rid of it. This is going to take some deeper debugging, perhaps looking into the CHS to see what's going on. Here is the top of the stack trace:

WARN: removeCollection(): (2009-07-28 12:05:53,940 http-8080-Processor23_org.sakaiproject.content.impl.BaseContentService)
org.sakaiproject.exception.InconsistentException id=/group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/
at org.sakaiproject.content.impl.BaseContentService.removeCollection(BaseContentService.java:2496)
at org.sakaiproject.content.impl.BaseContentService.removeCollection(BaseContentService.java:2575)
at org.sakaiproject.content.cover.ContentHostingService.removeCollection(ContentHostingService.java:283)
at com.softchalk.lessonbuilder.servlet.LBGZip.handleDelete(LBGZip.java:342)
at com.softchalk.lessonbuilder.servlet.LBGZip.delete(LBGZip.java:303)

Here is a clue from BaseContentService:

if (!members.isEmpty()) throw new InconsistentException(edit.getId());

My guess is that this exception is a signal to higher level code that members of the collection need to be deleted before the collection can be deleted. It's possible that this ability was changed from the original behavior. Still, why print the stack trace? More clues:

// clear of all members (recursive)
// Note: may fail if something's in use or not permitted. May result in a partial clear.

Added a bunch of printfs with the following results:

        • LGBZip.handleDelete() - Just after first delete.
        • LGBZip.handleDelete() - Between deletes.
        • LGBZip.handleDelete() - IdUnused exception.
        • LBG: response: <?xml version="1.0" encoding="UTF-8"?>
          <response><callerIP>127.0.0.1</callerIP><success/></response>

Note that on the second attempt to delete the package collection, we get an IdUnused exception. I think this is a bug in Sakai, really. The same sort of problem happens in the Resource Tool, though it is possible to "unclear" the collection and then finally remove it.

Created LBGFolder. Implemented getFolders, getItems, getItemsAndFolders. All working.
Implemented add and rename. Both working.
Implemented delete folder. (not tested yet)
Implemented ZIP replace (not tested)

July 29, 2009
-------------

A bit of sleuthing on Sakai Jira turned up SAK-12126, which indicates that collections need to be deleted twice if they contain a sub-collection.

Add ZIP item was changed to return the id of the containing collection. I think that should suffice for a replace operation.
First test of ZIP replace returned a Permission error (as an error response embedded in an error response, which is weird).
It's failing in the delete collection. The only thing that happens before this delete request is fetching the launch file name stored as a property on the containing collection. Could this be causing a problem? I wouldn't think it would. I'll stub out this part of the code and see what happens. No different - still fails. Unstubbed code.

This problem points back to the delete issue. There is another approach: recursively delete the contents of a collection before deleting the collection entity itself. More code, but it should work (in theory).
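
A sketch of the recursive approach (cover class assumed; checked exceptions pushed up via throws):

import java.util.List;
import org.sakaiproject.content.api.ContentCollection;
import org.sakaiproject.content.cover.ContentHostingService;

// Remove every member first (recursing into sub-collections), then the collection itself.
private void deleteCollection(String collectionId) throws Exception {
    ContentCollection col = ContentHostingService.getCollection(collectionId);
    List memberIds = col.getMembers();                  // ids of immediate children
    for (Object idObj : memberIds) {
        String id = (String) idObj;
        if (id.endsWith("/"))
            deleteCollection(id);                       // sub-collection
        else
            ContentHostingService.removeResource(id);   // plain resource
    }
    ContentHostingService.removeCollection(collectionId);
}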

Created a private method called deleteCollection().
New approach to delete collection is half working. It didn't delete the resources.

        • LBG: Request is http://localhost:8080/lbgateway/delete ? contentId=/group/1
          240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame&packageType=ZIP
        • LGBZip.handleDelete() - Just before delete collection.
        • LBGZip.deleteCollection() - number of items to delete is 48
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ - Collection
        • LBGZip.deleteCollection() - number of items to delete is 2
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ada_image1.html - Resource
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ada_image2.html - Resource
        • LGBZip.handleDelete() - Just after collection, before delete nav link.
        • LGBZip.handleDelete() - Just after nav link.
        • LGBZip.handleDelete() - Between deletes.
        • LBG: response: <?xml version="1.0" encoding="UTF-8"?>
          <response><callerIP>127.0.0.1</callerIP><success/></response>

Somehow, the top level loop is being exited after the first item (a sub-collection).

        • LBG: Request is http://localhost:8080/lbgateway/delete ? contentId=/group/1
          240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame&packageType=ZIP
        • LGBZip.handleDelete() - Just before delete collection.
        • LBGZip.deleteCollection() - number of items to delete is 48
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ - Collection
        • LBGZip.deleteCollection() - number of items to delete is 2
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ada_image1.html - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 0
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ada_image2.html - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 1
        • LGBZip.handleDelete() - Just after collection, before delete nav link.
        • LGBZip.handleDelete() - Just after nav link.
        • LGBZip.handleDelete() - Between deletes.

Not quite catching all the errors. Added a bit more code and now I'm getting a PermissionException:

        • LBG: Request is http://localhost:8080/lbgateway/delete ? contentId=/group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame&packageType=ZIP
        • LGBZip.handleDelete() - Just before delete collection.
        • LBGZip.deleteCollection() - number of items to delete is 48
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ - Collection
        • LBGZip.deleteCollection() - number of items to delete is 2
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ada_image1.html - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 0
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ada_image2.html - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 1
        • LBG: response: <?xml version="1.0" encoding="UTF-8"?>
          <response><error>PermissionError</error><callerIP>127.0.0.1</callerIP></response>

Added a stack trace on error:

org.sakaiproject.exception.PermissionException user=admin lock=content.delete.any resource=/content/group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/
at org.sakaiproject.content.impl.BaseContentService.removeCollection(BaseContentService.java:2550)
at org.sakaiproject.content.cover.ContentHostingService.removeCollection(ContentHostingService.java:283)
at com.softchalk.lessonbuilder.servlet.LBGZip.deleteCollection(LBGZip.java:459)

It might be that this is the basis of the error in CHS w.r.t. deleting collections that contain sub-collections. I will try ignoring the PermissionException and see what happens.

Well, that gets me closer, but not quite there. The containing folder and the navigation link are not being deleted. Debug info is now:

        • LBG: Request is http://localhost:8080/lbgateway/delete ? contentId=/group/1
          240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame&packageType=ZIP
        • LGBZip.handleDelete() - Just before delete collection.
        • LBGZip.deleteCollection() - number of items to delete is 48
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ - Collection
        • LBGZip.deleteCollection() - number of items to delete is 2
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ada_image1.html - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 0
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada_files/ada_image2.html - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 1
        • LBGZip.deleteCollection() - ignoring PermissionException.
        • LBGZip.deleteCollection() - bottom of the loop: 0
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/Sesame_sco2004.zip - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 1
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada-access.gif - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 2
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/ada-activity.gif - Resource
          ...
        • LBGZip.deleteCollection() - bottom of the loop: 45
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/XMLSchema.dtd - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 46
        • LBGZip.deleteCollection() - item to delete is /group/1240e0c5-13b1-4802-8e2f-8927d80ab219/Lessons/Sesame/imsmanifest.xml - Resource
        • LBGZip.deleteCollection() - bottom of the loop: 47
        • LBG: response: <?xml version="1.0" encoding="UTF-8"?>
          <response><error>IdUnusedError</error><callerIP>127.0.0.1</callerIP></response>
          WARN: run(): ghost-busting server: localhost-1248881885721 from : localhost-1248
          882779908 (2009-07-29 11:54:19,002 SakaiClusterService.Maintenance_org.sakaiproj
          ect.cluster.impl.SakaiClusterService)

July 30, 2009
-------------
I moved the final delete of the containing collection up out of deleteCollection() into handleDelete(). I'm pretty confident that deleteCollection() removes all elements of the collection so that it can be finally deleted. However, I'm getting an IdUnusedException on the final delete. When I turn on the stack trace, I get:

org.sakaiproject.exception.IdUnusedException id=/group/1240e0c5-13b1-4802-8e2f-8
927d80ab219/Lessons/Sesame/ada_files/
at org.sakaiproject.content.impl.BaseContentService.editCollection(BaseContentService.java:4121)
at org.sakaiproject.content.impl.BaseContentService.removeCollection(BaseContentService.java:2562)
at org.sakaiproject.content.impl.BaseContentService$BaseCollectionEdit.clear(BaseContentService.java:10228)
at org.sakaiproject.content.impl.BaseContentService.removeCollection(BaseContentService.java:2572)
at org.sakaiproject.content.cover.ContentHostingService.removeCollection(ContentHostingService.java:283)
at com.softchalk.lessonbuilder.servlet.LBGZip.handleDelete(LBGZip.java:396)

Note that removeCollection() thinks that it is trying to delete /ada_files/ in spite of being passed an id of /Sesame/. Sigh, another CHS bug. This is maddening. Something is getting cached deep down. Likely these problems are all related, but that doesn't matter to me. I am hoping that there is a way to cause CHS to release its cache or clear state, somehow.

The exception is being generated in BaseContentService (line 4121). This is in editCollection():

// check for existance
if (!m_storage.checkCollection(id))
{
    throw new IdUnusedException(id);
}

I might be able to get the collection for edit, and then cancel it - hopefully clearing any caches as a side effect.
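A sketch of that idea, assuming the standard CHS cover calls (whether cancelling the edit actually flushes any cached state is exactly what needs testing):

import org.sakaiproject.content.api.ContentCollectionEdit;
import org.sakaiproject.content.cover.ContentHostingService;

// Open the collection for edit and immediately cancel the edit, in the hope
// that the edit/cancel cycle refreshes whatever CHS has cached for this id.
ContentCollectionEdit edit = ContentHostingService.editCollection(collectionId);
ContentHostingService.cancelCollection(edit);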

This code is from removeCollection:

// find the collection
ContentCollection thisCollection = findCollection(id);
if (thisCollection == null) throw new IdUnusedException(id);

// check security: can we remove members (if any)
// Note: this will also be done in clear(), except some might get deleted before one is not allowed.
// unlockContained(AUTH_RESOURCE_REMOVE, thisCollection);

// get an edit
ContentCollectionEdit edit = editCollection(id);

The exception is thrown from the call to editCollection(). Note that the call to findCollection(id) succeeded only a few lines before. How could this be? Here is another clue:

        • LBGZip.handleDelete() - package collection should be empty. Number of entries: 23

This happens after the call to deleteCollection(). This suggests that CHS thinks that the containing collection still has entries in it, in spite of having been just deleted. Something is not getting updated. Here is the code that implements getMemberCount:

int count = 0;
Integer countObj = (Integer) ThreadLocalManager.get("getMemberCount@" + this.m_id);
if (countObj == null)
{
    count = m_storage.getMemberCount(this.m_id);
    ThreadLocalManager.set("getMemberCount@" + this.m_id, new Integer(count));
}
else
{
    count = countObj.intValue();
}

return count;

Note that the thread manager is being used to cache the member count. Wellll, it would be a TOTAL hack, but I could overwrite this number and make it zero after deleting members of the collection. There is even an example of the set code right there. I just need to do:

ThreadLocalManager.set("getMemberCount@" + packageCollectionId, new Integer(0));

Added line to handleDelete(). Sadly, it didn't work. I get the same IdUnusedException. I had high hopes for this hack, it's REALLY too bad it didn't work. Just to make sure, I uncommented the lines to show the number of entries in the collection:

        • LBGZip.handleDelete() - package collection should be empty. Number of entries: 0

So the hack worked (as far as it goes), but didn't solve the problem.

Aug 1, 2009
-----------

Steve Marquard mentioned on the dev list that the CHS delete collection problem was fixed in the 2.5.x branch. I started to build this version of Sakai locally yesterday, but I decided that it would be better to see if it was fixed in 2.5.5, since that is more likely to be a version in production than 2.5.x. Downloading and building it takes about 1.5 hours. I'm now at the point of building wicket, scorm, and softchalk.

Failed. Grr. I'm going to code up a simple collection delete just to make sure, but it looks like I have to grab the CHS service from 2.5.x to make this work.

I had dropped the MySQL connector JAR from the tomcat build. Also the sakai properties. Added them both back.

Aug. 2, 2009
------------

Confirmed that the Inconsistency error is still present in Sakai 2.5.5. The next step is to attempt to update just the CHS service in the 2.5.5 environment and see if the fix is in.

Copied sakai_2.5.x/content to sakai_2-5-5. Updated version numbers (M2 to 2.5.5) in 15 POM files (ugh).
Tested against simple delete method. Works - Finally!

Unfortunately, this means that 2.5.x customers of Sakai will have to update their CHS to the 2.5.x trunk. Some are not going to like that, especially for just a beta test. Still, it can't be helped. The fixes are also present in 2.6.0, but not many Sakai installations have updated to the new version yet (in spite of it taking more than a year to release).

Aug. 3, 2009
------------

Finished up work on Zip replace. I had to break out the add item logic into handleAdd so that I could pass parameters into it. They differ between add and replace, so I needed a way to work up the right parameters and pass them along. I tried adding them to the servlet request, but they can only be added as attributes, not parameters. Whatever. It seems to work just fine now.
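Roughly what the split looks like (hypothetical signatures; the point is just that replace builds its own parameter map instead of reading everything back from the request):

import java.util.HashMap;
import java.util.Map;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical sketch: handleAdd() takes an explicit parameter map, so the add
// and replace entry points can each call it with their own values.
private void handleAdd(HttpServletRequest req, HttpServletResponse resp,
                       Map<String, String> params) throws Exception
{
    String name = params.get("name");
    String launchFilename = params.get("launchFilename");
    // ... unpack the zip and create the item as before ...
}

private void handleReplace(HttpServletRequest req, HttpServletResponse resp) throws Exception
{
    // Delete the old package first (not shown), then re-add using parameters
    // worked up here rather than carried on the request.
    Map<String, String> params = new HashMap<String, String>();
    params.put("name", req.getParameter("name"));
    params.put("launchFilename", req.getParameter("launchFilename"));
    handleAdd(req, resp, params);
}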

Updated code on softchalk.com to Sakai 2.5.5. Added wicket, scorm2004, and lbgateway. Built basic Sakai - no errors.

Aug. 4, 2009
------------

Built wicket, scorm2004, and lbgateway - no errors.
However, on Tomcat startup, I'm getting two serious errors:

  • Address 8080 in use
  • SQLException: Access denied for user 'sakai@127.0.0.1' using password YES.

There are a couple of Java tasks running on the server, but there isn't any way to tell what each of them is.
The sakai.properties file wasn't updated to the correct database name and password (User name is 'godwinyj_11' and the password is 'A4zmQwgA'). Copied the properties file from the old tomcat and also put it into the master copy under ~/mark/apache-tomcat-5.5.26.

Server is up and seems to be working.

Turning to the challenge of getting SCORM integration to work: I need to find a way to inject an instance of ScormResourceService.
Here is a snip from applicationContext.xml from the ScormTool:

<bean id="toolWicketApplication" class="org.sakaiproject.scorm.ui.player.ScormTool">
<property name="resourceService"><ref bean="org.sakaiproject.scorm.service.api.ScormResourceService"/></property>
</bean>

Created an applicationContext.xml file in WEB-INF for lbgateway:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">

<beans>
<!-- Inject a reference to the SCORM Resource Service -->
<bean id="lbgatewayApplication" class="com.softchalk.lessonbuilder.servlet.LessonBuilderServlet">
<property name="scormService"><ref bean="org.sakaiproject.scorm.service.api.ScormResourceService"/></property>
</bean>
</beans>

Added the following listener to web.xml in the hopes that it will cause Spring to perform the injection:

<listener>
<listener-class>org.sakaiproject.util.ContextLoaderListener</listener-class>
</listener>

Aug. 5, 2009
------------

Continue to have problems injecting the ScormResourceService into the lbgateway app. After email conversations with James Renfro, I tried to modify the existing code to eliminate the type mismatch error that I'm seeing:

org.springframework.beans.TypeMismatchException: Failed to convert property value of type [org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService$$EnhancerByCGLIB$$544aa352] to required type [org.sakaiproject.scorm.service.api.ScormResourceService] for property 'scormService'; nested exception is java.lang.IllegalArgumentException: Cannot convert value of type [org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService$$EnhancerByCGLIB$$544aa352] to required type [org.sakaiproject.scorm.service.api.ScormResourceService] for property 'scormService': no matching editors or conversion strategy found
Caused by:
java.lang.IllegalArgumentException: Cannot convert value of type [org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService$$EnhancerByCGLIB$$544aa352] to required type [org.sakaiproject.scorm.service.api.ScormResourceService] for property 'scormService': no matching editors or conversion strategy found

The original code has an abstract class called SakaiResourceService that extends AbstractResourceService. AbstractResourceService implements the ScormResourceService interface. I thought that adding "implements ScormResourceService" to SakaiResourceService might do it, but that didn't eliminate the problem. I then thought the fact that SakaiResourceService is abstract was causing the problem (a rather strange construct, if you ask me), but making it concrete caused problems in some of the internal methods. Fixed those abstract methods by making calls to the corresponding Sakai service covers. That compiled, but still didn't fix the problem.

Looking back at my applicationContext.xml file, I note that we inject a reference to a bean:

<ref bean="org.sakaiproject.scorm.service.api.ScormResourceService"/>

I believe there is a way to inject a new instance of SakaiResourceService, but I don't quite recall how. Some research is called for.
I tried this approach:

<bean id="scorm" class="org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService;"/>

<!-- Inject a reference to the SCORM Resource Service -->
<bean id="lbgatewayApplication" class="com.softchalk.lessonbuilder.servlet.LessonBuilderServlet">
<property name="scormService" ref="scorm" />
</bean>

Well, that didn't work out so well:

ERROR: Context initialization failed (2009-08-05 11:28:08,770 main_org.springframework.web.context.ContextLoader)
org.springframework.beans.factory.CannotLoadBeanClassException: Cannot find class [org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService;] for bean with name 'scorm' defined in ServletContext resource [/WEB-INF/applicationContext.xml]; nested exception is java.lang.ClassNotFoundException: org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService;

The SCORM bean definition is in scorm-impl/pack/../spring-sakai-conntent.xml:

<!-- The version that makes use of the Sakai repository to store content, but serves it through the tool -->
<bean id="org.sakaiproject.scorm.service.api.ScormResourceService"
class="org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService"
singleton="true">

<lookup-method name="configurationService" bean="org.sakaiproject.component.api.ServerConfigurationService" />
<lookup-method name="contentService" bean="org.sakaiproject.content.api.ContentHostingService" />
<lookup-method name="toolManager" bean="org.sakaiproject.tool.api.ToolManager" />

</bean>

That definition looks just right. It also explains how access to the various Sakai service managers is done.

Aug. 9, 2009
------------

Implemented the LBGFile handler object.
AddItem is working
Delete gives permission error.

Aug. 10, 2009
-------------
Delete problem turned out to be the wrong id.
Delete file item now works.
Replace file works.
Rename file works.
Get file works.

I think it is time to strip out the old code from the LessonBuilderServlet.
Builds with all of the old handlers removed.
This effectively completes the refactoring of the LB gateway application.

What remains is:

  • Support for SCORM 2004 packages
  • Custom orders (moving things around)

August 13, 2009
---------------

Jeff found a number of bugs that I'm trying to track down.
Tried to upload a zip to softchalk.com - failed with Collection Not Found error.
This turned out to be a problem with my test links, now fixed.
Tested ZIP upload on localhost and seems to work fine.
Tested ZIP upload on softchalk.com and it also seems to work ok.

Jeff reported the following bugs:

1. Can't rename a folder. PermissionError.
2. Zip upload, ArgumentMissingError. launchFilename not launchFileName
3. URL added as well points to localhost rather than the instance into which the zip was added.
4. No launch file created. IdUnusedError

Bug #1: I can't seem to reproduce it.
There were some problems in my test links, but once fixed, I could create a folder and rename it.

Bug #2 just seems to be an error message problem.
The check for the launchFilename parameter is looking for the right name, but the error message reported launchFileName as missing.
Fixed the error message in v0.11c.

Bug #3 does reproduce, giving a URL of:
http://localhost:8080/access/content/group/3e26b0e1-bfed-45e7-9714-ba8cbf49482d/Lessons/Sesame/index.html
An interesting problem resulting from using "String launchUrl = launchRsrc.getUrl()". I would think that Sakai would give me back a usable URL, but in fact it's usable only from within the server, not from a client.
Had to build the URL using the java.net.URL class. Tested locally and it works. Need to test on the SoftChalk.com server. Tested and working. v0.11c posted to softchalk.com.
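A sketch of the fix, assuming the servlet request is available at that point (the path comes from the URL that CHS hands back, but the scheme, host, and port come from the incoming request; MalformedURLException handling omitted):

import java.net.URL;
import javax.servlet.http.HttpServletRequest;

// launchRsrc.getUrl() returns something like http://localhost:8080/access/content/...;
// keep only the path portion and rebuild the URL against the client-visible host and port.
String internalUrl = launchRsrc.getUrl();
String path = new URL(internalUrl).getFile();
URL launch = new URL(request.getScheme(), request.getServerName(),
        request.getServerPort(), path);
String launchUrl = launch.toExternalForm();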

Bug #4 doesn't seem to reproduce, at least from my test links.
Likely, this is related to Bug #3. It seems to be working after the fixes were made.

========================================================================

James Renfro replies:

Hi Mark,
So I had a chance to load up your demo servlet and I see several issues
that explain why you're not able to inject the ScormResourceService into
your class.

(1) The scorm-api is not a "provided" dependency in the pom.xml – this
means that the tool will have its own copy in WEB-INF/lib and the
ClassLoader will be confused about which one to use, since there's also
one in shared/lib

(2) The scorm-impl is included as a dependency in your pom.xml – this
shouldn't be necessary since the API interface is all you need to call
the exposed methods in the interface, and including the scorm-impl in
your WEB-INF/lib is both unnecessary and could also potentially confuse
the ClassLoader.

(3) You're referencing the implementation class SakaiResourceService
rather than the interface ScormResourceService . . . in Sakai's Spring
configuration, it's the interfaces that get shared via the shared/lib in
Tomcat, not the implementation classes, which are located in components
and shared between webapps via the ComponentManager behind the scenes.

I'm attaching an amended version of the code you sent that now compiles
and deploys to Sakai correctly. Note that I modified the <version> of
scorm in the pom.xml to M2 rather than $sakai.version. You'll want to
change that back, I imagine.

Hope that helps. Best of luck,
James.

August 16, 2009 - Still in San Jose
---------------

Uncommented entry in components.xml.
Changed the dependency in the POM to use provided scope.
However, there must be more to it because:

        • LBGServlet: SCORM service is null.

Cleaned up my version of lbgateway.
Built James's version of lbgateway.
It also fails.

This says that there is likely a problem in the way that SCORM is being built.
I think that James downloaded the whole SCORM2004 component and built it under the M2 version. That's worth a try, I guess.
I need the M2 version of wicket, too. Sigh.
Both wicket and scorm2004 build under the M2 version.
Rebuilt softchalk-renfro.
Still doesn't work. This is not good.
Started it up under Cygwin. No error messages in catalina.out. That's kinda weird.

Some things to try:
Add a printf in the set method that initializes the gateway.
Try using the ComponentManager again.

Compared some files:
web.xml files are identical
applicationContext.xml files are identical
pom.xml files differ in the SCORM dependency version numbers: ${sakai.version} vs. 2.5.5

Added a bunch of printfs in the gateway.init() method, in the scormService setter, and in the request processing. This is what happens when the gateway component is created by Tomcat:

INFO: Initializing Spring root WebApplicationContext (2009-08-16 16:42:18,515 main_org.apache.catalina.core.ContainerBase.[Catalina].[localhost].[/lbgateway])

        • LBGServlet: setScormService() - org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService$$EnhancerByCGLIB$$de685231@1ad8678
        • LBGServlet: SCORM service null at gatway init.

Note that the setScormService method is being called, but then the service is null at init() time. Later, when we make the first request:

        • LBGServlet: SCORM service is null.
        • LBGServlet: Got SCORM service from ComponentManager.
        • LBG: Request is http://localhost:8080/lbgateway/login ? name=admin&password=admin
        • LBG: response: <?xml version="1.0" encoding="UTF-8"?>
          <response><callerIP>127.0.0.1</callerIP><user>admin</user><sessionId>71d1d869-2475-4b07-9bc2-48d43b8654a3</sessionId></response>

At the start of the request, the service is still null. Injection has failed. Actually, it didn't really fail. I've seen this kind of behavior before. An instance of lbgateway was created and another created later. The first gets initialized, the second doesn't.

The truly important thing is that the ComponentManager now seems to be able to get an instance of the service. I don't really care how I get access to the service as long as I get it. This is a breakthrough.
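The fallback is simple enough (a sketch, assuming the ComponentManager cover; note the cast is to the shared API interface, never the impl class):

import org.sakaiproject.component.cover.ComponentManager;
import org.sakaiproject.scorm.service.api.ScormResourceService;

// If Spring injection left the field null, ask the ComponentManager for the
// shared service by its interface name (which is also the bean id).
if (scormService == null)
{
    scormService = (ScormResourceService) ComponentManager
            .get(ScormResourceService.class.getName());
}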

Stripped out the debug strings. In the process of migrating access to the Scorm service into the LBGScorm2004 object when it is constructed.
If the basic addItem call works, I will strip out the Spring injection, since it isn't working anyway.
Added the SCORM2004 package type to the addItem command entry. Calls out to LBGScorm2004.add().
Got the stubbed success result. That means that the Scorm service is successfully obtained in the LBGScorm2004 handler object.

Commented out injection in applicationContext.xml. Removed object variable in LessonBuilderServlet.

At this point, I can finally start coding SCORM 2004 publishing from LessonBuilder. An LBGScorm2004 object was previously created.
The easiest way to implement addItem(SCORM) seems to be:

ScormResourceService
public String putArchive(InputStream stream, String name, String mimeType, boolean isHidden);

It would be way too simple to expect this to just work:

        • LBG: Request is http://localhost:8080/lbgateway/addItem ? folderId=MyCourse
          &contentId=/group/6db1b7f2-3051-43f3-9f3f-874ceafb1dff/&name=Sesame&launchFilena
          me=index.html&packageType=SCORM2004
          java.lang.NullPointerException
          at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.getRootDirectoryPath(SakaiResourceService.java:217)
          at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.putArchive(SakaiResourceService.java:154)
          at com.softchalk.lessonbuilder.servlet.LBGScorm2004.handleAdd(LBGScorm2004.java:178)

I should probably dig a bit to confirm, but I suspect that the SakaiResourceService is trying to get the root directory for the site of the active tool. After all, this IS being called from the ScormTool, right? Well, maybe not (definitely not, in fact). Here is the implementation of getRootDirectoryPath() in SakaiResourceService:

protected String getRootDirectoryPath()
{
    String siteId = toolManager().getCurrentPlacement().getContext();
    String collectionId = contentService().getSiteCollection(siteId);
    return collectionId;
}

This might be tough to work around. The parameters to putArchive don't include a path of any kind. It assumes that the context of the active tool is known (site id). The other approach is to use convertArchive():

String convertArchive(String resourceId, String title) throws InvalidArchiveException

This (presumably) converts an archive that already exists in Content Hosting Service. A superficial scan through the code doesn't reveal any attempts to get the root via the site id.

Sadly, if you drill deep enough, the SCORM service wants to get the root directory again:

        • LBG: Request is http://localhost:8080/lbgateway/addItem ? folderId=/group/6
          db1b7f2-3051-43f3-9f3f-874ceafb1dff/&name=Sesame&launchFilename=index.html&packa
          geType=SCORM2004
          java.lang.NullPointerException
          at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.getRootDirectoryPath(SakaiResourceService.java:217)
          at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.getContentPackageDirectoryPath(SakaiResourceService.java:224)
          at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.getResourcePath(SakaiResourceService.java:142)
          at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.newItem(SakaiResourceService.java:354)
          at org.sakaiproject.scorm.service.impl.AbstractResourceService.unpackEntry(AbstractResourceService.java:95)
          at org.sakaiproject.scorm.service.impl.AbstractResourceService.unzip(AbstractResourceService.java:104)
          at org.sakaiproject.scorm.service.impl.AbstractResourceService.unpack(AbstractResourceService.java:83)
          at org.sakaiproject.scorm.service.impl.AbstractResourceService.convertArchive(AbstractResourceService.java:22)
          at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.convertArchive(SakaiResourceService.java:52)
          at com.softchalk.lessonbuilder.servlet.LBGScorm2004.handleAdd(LBGScorm2004.java:184)

Two strikes, one more and I'm out. About the only thing left (short of significant changes to the SCORM service itself) is to attempt to force the tool manager into thinking that it's the Scorm tool in this particular course that's active. Frankly, that's a pretty serious hack, but it might work. The lbgateway has its own user session, created by the login. I might run into security problems, however. I tend to run these tests as admin, but lesser mortals might not have the security permissions. This needs testing later.

Service Manager research:
The SessionManager has a setCurrentToolSession().
ToolManager and ActiveToolManager both do not have any means to set the active tool.
Nothing in SiteService.

ToolManager.getCurrentPlacement() returns a Placement.
Placement.getContext() returns a string, presumably a reference to the current site.

It is possible to create a new Placement that includes the site context:

Placement(String id, String toolId, Tool tool, Properties config, String context, String title)

The question is: how does this placement become the current placement? Here is an interesting clue in the ToolManager implementation:

protected final static String CURRENT_PLACEMENT = "sakai:ToolComponent:current.placement";

The current placement is a key. That implies that it's being kept as a property somewhere.
More clues in this implementation:

public Placement getCurrentPlacement()
{
    return (Placement) threadLocalManager().get(CURRENT_PLACEMENT);
}

Where is the ThreadLocalManager? It's in the util component and there is a cover for it. Furthermore, properties can be set:

void set(String name, Object value);

If I call this with ToolManager.CURRENT_PLACEMENT and set it to a Placement that I create, it should do the trick. The placement is going to want a valid Sakai tool, however. Currently, the lbgateway is not a Sakai tool. I think I ran into problems with request routing or processing of the request parameters. Oh, I recall, it was problems with Apache File Upload. Frankly, I suspect it really doesn't matter what tool gets added. I might use the SCORM tool, for instance. It needs to be present for this to work anyways.

Current Placement needs to reflect which site is being accessed, so it needs to be updated on any request handled by LBGScorm2004.

I think I will stub out LBGScorm2004.replace() as a place where I can experiment with Tool Placement. Interestingly, this would allow me to use putArchive(), too.

Ok, the basic framework is in place now. Initial return is:

        • LBG: Request is http://localhost:8080/lbgateway/replaceItem ? courseId=courseId=6db1b7f2-3051-43f3-9f3f-874ceafb1dff&packageType=SCORM2004
        • LBGScorm2004.placement() - Course Id is: courseId=6db1b7f2-3051-43f3-9f3f-874ceafb1dff
        • LBGScorm2004.placement() - Current Placement is: null
        • LBG: response: <?xml version="1.0" encoding="UTF-8"?><response><error>Stubbed Success</error><callerIP>127.0.0.1</callerIP></response>

As expected, the current placement is null.
The placement is going to require the following elements:

  • A placement id (could be made up, I think)
  • A tool id (sakai.scorm.tool)
  • A tool object (ScormTool)
  • Properties (can be null)
  • A context (site id, though it may need to be in the form of a reference)
  • A context title (made up)

Getting the ScormTool yields:

        • LBGScorm2004.placement() - Scorm Tool is: org.sakaiproject.tool.impl.ActiveToolComponent$MyActiveTool@7a541135

which looks good. To create a Placement, however, I'm going to have to include the tool impl component.
Based on this:

        • LBGScorm2004.placement() - Course Id is: courseId=6db1b7f2-3051-43f3-9f3f-874ceafb1dff
        • LBGScorm2004.placement() - Current Placement is: null
        • LBGScorm2004.placement() - Scorm Tool is: org.sakaiproject.tool.impl.ActiveToolComponent$MyActiveTool@7a541135
        • LBGScorm2004.placement() - New Current Placement is: org.sakaiproject.util.Placement@cee4249f

It looks like I've managed to create a new placement and make it the active one. While it's possible that I've created a flawed placement, I'm not too worried about it. It is only associated with the session thread, which in this case is the lbgateway thread. Furthermore, none of the other request handlers care about tool placement. This is purely a hack to get around the fact that the Scorm Service looks for the site context by assuming that there is an active tool.

We'll rename this method to forcePlacement() and use it as needed.
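A sketch of forcePlacement(), assuming the Placement constructor quoted above and the ToolManager/ThreadLocalManager covers (the placement id and title are made up, as noted in the list of required elements):

import org.sakaiproject.thread_local.cover.ThreadLocalManager;
import org.sakaiproject.tool.api.Tool;
import org.sakaiproject.tool.cover.ToolManager;
import org.sakaiproject.util.Placement;

// Fabricate a Placement for the SCORM tool in the target site and make it the
// "current" placement for this thread, so the SCORM service can resolve the
// site collection via toolManager().getCurrentPlacement().getContext().
private void forcePlacement(String siteId)
{
    Tool scormTool = ToolManager.getTool("sakai.scorm.tool");
    Placement placement = new Placement("lbgateway-placement",  // made-up id
            scormTool.getId(), scormTool, null, siteId, "LB Gateway");
    ThreadLocalManager.set("sakai:ToolComponent:current.placement", placement);
}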

The name parameter in putArchive() needs to be a file name.

Well, not complete success, but a partial one. putArchive() seems to be called successfully. The zip archive for the scorm package is added at the root of the course site collection and it is hidden. However, it doesn't show up in the ScormTool.
I might have to use convertArchive() after all, since it at least unpacks the zip archive. I don't see anything in there about registering it with the ScormTool, however.

I renamed the Sesame package to Numbers and uploaded it into the GEO-319-s3 course. It shows up as a zip package at the root, called Numbers_sco2004.zip. It also appears in the SCORM tool as Sessame - which is the original package name I gave it. I suspect that this name is extracted from the manifest when it is registered with the SCORM tool. I'll have to read the ScormTool code and see how it uploads and registers a SCORM package.

Aug. 17, 2009 - San Jose to Ithaca
-------------

One difference between putArchive() and convertArchive() is that the latter unpacks the ZIP. While I can't see any kind of registration, there might be some kind of hidden side effect that causes registration to happen.

Had a look at the ScormTool sources.
FileUploadForm calls putArchive() with the upload file stream.
Checked several files. Only putArchive() seems to be called.
For future reference, this seems to be how to remove a package:

contentService.removeContentPackage(contentPackageId);

So far, I have only seen reference to the ScormResourceService and the ScormContentService.
Looking over the ContentPackage API, it occurs to me that my problem might be as simple as setting an availability date.
I was going to try and update the ContentPackage with a release Date, but ran into a problem in the ContentService. The getContentPackage() method takes a content package Id as a Long, which means it is NOT the same as the CHS resource Id. Furthermore, I have no idea how to get that id. I can try setting a release Date on the resource itself and hope for the best.

Created a showPackages() method in LBGScorm2004 and tied it to the get request.
The results indicated no SCORM packages, which is a bit odd, since I uploaded at least one by hand into the SCORM tool. Why is it that everything about the SCORM package and service is hard? Even the simplest thing is way harder than it should be or just doesn't work at all. Grr.

I think I'm going to just have to ask James again for help. There is something simple that I'm missing.

Aug. 18, 2009
-------------

Reply from James concerning registering archives:

Yes, you have to call convertArchive() after putArchive().

It's been a while since I was in the code, but I think you want to call ScormResourceService.getResources() to retrieve the actual files (packages).

You'll also need the Resources tool in the site in order to make the ScormTool work for that site.

Aug. 19, 2009
-------------

ScormResourceService.getResources() takes a uuid as an argument. There is no documentation. It makes a call to getContentPackageDirectoryPath(), which is implemented as:

return new StringBuilder(getRootDirectoryPath()).append(uuid).append("/").toString();

Based on that, I believe that the parameter really isn't a resource id, but rather a path to a collection, such as "/Lessons". It also looks like null is an acceptable argument, causing it to search the whole site collection. Note also that forcePlacement() needs to be called before using this method, since it also depends on the active tool in the current tool placement.

Turns out I was wrong about passing null as an argument:

        • LBG: Request is http://localhost:8080/lbgateway/get ? courseId=courseId=6db
          1b7f2-3051-43f3-9f3f-874ceafb1dff&packageType=SCORM2004
          ERROR: Caught an exception looking for content packages (2009-08-19 13:17:53,670 http-8080-Processor25_org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService)
          org.sakaiproject.exception.IdUnusedException id=/group/courseId=6db1b7f2-3051-43f3-9f3f-874ceafb1dff/null/
          at org.sakaiproject.content.impl.BaseContentService.getCollection(BaseContentService.java:2110)
          at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.getContentResourcesRecursive(SakaiResourceService.java:268)
          at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.getResources(SakaiResourceService.java:150)
          at com.softchalk.lessonbuilder.servlet.LBGScorm2004.getPackages(LBGScorm2004.java:495

So if I pass in "Lessons" as a folderName in addition to the courseId, I get back:

<response>
<callerIP>127.0.0.1</callerIP>
<pacakge>
/group/6db1b7f2-3051-43f3-9f3f-874ceafb1dff/Lessons/test.txt
</pacakge>
</response>

I don't really understand why I get back test.txt as a package, but at least something is happening. The Lessons folder doesn't have any packages in it currently. Time to switch focus to adding expandArchive() to the upload.

If I give "GEO-319-s3 Resources" as a folderName, I get no results.

Ok, the package is uploaded, but not added to the folder I specified. It's going in at the root. It is expanded, but is not showing up in the ScormTool. This actually agrees with my reading of the implementation code. I didn't see any place in convertArchive() that suggested that it was being registered with the ScormTool.

Aug. 20, 2009
-------------

Still trying to figure out how to register a SCORM package. No word from James Renfro - maybe he hates me.
Searching through ScormTool files for convertArchive() on the theory that it's needed to expand the package.

putArchive() is called in UploadPage, followed by:

int status = contentService.validate(resourceId, false, isFileValidated());

It's possible that this is the one that does it, likely as a side effect. If you think about it, how can the tool register a package that doesn't validate?
Lo! and Behold! There, buried in a call in ScormContentServiceImpl.validate() is a call to convertToContentPackage(), which has the following comment:

/**
 * Takes the identifier for a content package that's been stored in the content repository
 * and creates the necessary objects in the database to make it recognizable as a content
 * package.
 */
private void convertToContentPackage(String resourceId, IValidator validator, IValidatorOutcome outcome) throws Exception {

So as long as the package validates, it will get registered. This still doesn't explain why packages uploaded by hand were not visible to getResources(), but let's burn one bridge at a time.

Ah, victory is sweet. Validate() was the missing piece. I am able to upload a SCORM package that appears in the ScormTool list. It validates correctly and I'm able to launch it. I'm going to back this up and post it to SoftChalk.com as version 0.12.
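For reference, the working add sequence, as a rough sketch (putArchive() signature as quoted above; the meaning of the two boolean flags on validate() is an assumption based on the UploadPage call, and resourceService/contentService are the ScormResourceService and ScormContentService references held by LBGScorm2004):

import java.io.InputStream;

// 1. Force a placement so the SCORM service can resolve the site collection.
// 2. Store the archive (name must be a file name; the service hides it).
// 3. Validate, which is the step that registers the package with the ScormTool.
forcePlacement(courseId);

String resourceId = resourceService.putArchive(
        zipStream,                 // InputStream of the uploaded package
        name + "_sco2004.zip",     // file name, e.g. Sesame_sco2004.zip
        "application/zip",         // assumed mime type
        true);                     // hidden

int status = contentService.validate(resourceId, false, true);  // flag values assumed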

Aug. 21, 2009
-------------

Deleting a Content Package looks like it's completely handled by ScormResourceService.removeResources().
Coded up delete and replace. Cleaned up addItem. Cleaned up built-in help.

I should convert dependency on content to the M2 version number, like wicket and SCORM. This will simplify Sakai installation.

Tested delete, it fails.

Aug. 25, 2009
--------------

More research into why LBGScorm2004.delete() fails:

WARN: An underlying collection or resource was not properly removed, causing this collection remove to fail: /group/6db1b7f2-3051-43f3-9f3f-874ceafb1dff//group/
6db1b7f2-3051-43f3-9f3f-874ceafb1dff/Sesame_sco2004.zip/ (2009-08-25 15:09:48,04
5 http-8080-Processor24_org.sakaiproject.scorm.service.sakai.impl.SakaiResourceS
ervice)
org.sakaiproject.exception.IdUnusedException id=/group/6db1b7f2-3051-43f3-9f3f-8
74ceafb1dff//group/6db1b7f2-3051-43f3-9f3f-874ceafb1dff/Sesame_sco2004.zip/
at org.sakaiproject.content.impl.BaseContentService.getCollection(BaseContentService.java:2110)
at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.removeResourcesRecursive(SakaiResourceService.java:231)
at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.removeResources(SakaiResourceService.java:199)
at com.softchalk.lessonbuilder.servlet.LBGScorm2004.handleDelete(LBGScorm2004.java:272)

So when the package is created, it returns "Sesame_sco2004.zip", but when I return to remove these resources, it throws the above error because that's not the id of a collection.

Changing the link command to:

http://localhost:8080/lbgateway/delete?courseId=6db1b7f2-3051-43f3-9f3f-874ceafb1dff&contentId=/group/6db1b7f2-3051-43f3-9f3f-874ceafb1dff/Sessame/&packageType=SCORM2004

Doesn't seem to help, though I would think that it would. Further playing around with the URL didn't help. Finally, some digging reveals that the id of the collection called "Sessame" at the root of this course is actually:

/group/6db1b7f2-3051-43f3-9f3f-874ceafb1dff/7fb90b03-8691-4869-be3c-4f44b7d8f567/

The SCORM service created a GUID to uniquely name the collection used to contain the expanded set of SCORM materials, but DOESN'T return it once created. Just for grins, let's put this reference id into the request and see if it will actually delete it.

Well, it took a while and then returned a ResourceNotDeletedError, based on:

ERROR: Unable to remove archive: /group/6db1b7f2-3051-43f3-9f3f-874ceafb1dff/7fb
90b03-8691-4869-be3c-4f44b7d8f567/ (2009-08-25 15:30:31,139 http-8080-Processor2
2_org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService)
org.sakaiproject.exception.PermissionException user=admin lock=content.delete.an
y resource=/content/group/6db1b7f2-3051-43f3-9f3f-874ceafb1dff/7fb90b03-8691-486
9-be3c-4f44b7d8f567/
at org.sakaiproject.content.impl.BaseContentService.removeCollection(BaseContentService.java:2551)
at org.sakaiproject.scorm.service.sakai.impl.SakaiResourceService.removeResources(SakaiResourceService.java:204)
at com.softchalk.lessonbuilder.servlet.LBGScorm2004.handleDelete(LBGScorm2004.java:272)

This shouldn't fail due to a permission exception, I don't think. After all, I can delete it using the ResourceTool.
Even though it indicated failure, the collection and its contents were (in fact) removed. Weird. It was not removed as an entry in the ScormTool.

The SCORM tool uses a different approach to delete a package:

contentService.removeContentPackage(contentPackageId);

I suspect the only way to get at the content package id (which is a Long) is via

contentService.getContentPackages();

I tried setting this up before and didn't get any results. Back to the drawing board, I guess.

Not sure what I did differently this time than before, but I got the "get" command to work for Scorm2004. It currently shows:

<response>
<callerIP>127.0.0.1</callerIP>
<package>Sessame</package>
</response>

I think I'm going to spend a bit more time on this and change the command name to getAll.
Ok, the output is now:

<response>
<callerIP>127.0.0.1</callerIP>
<packages>
<package>
<title>Sessame</title>
<id>4</id>
</package>
</packages>
</response>

So, this means that there is now a requirement that the title of the content package must match the name parameter passed in addItem. Part of the manifest reads:

<organizations default="SC">
<organization identifier="SC">
<title>Sessame</title>
<item identifier="SC-Lesson" identifierref="L1">
<title>Sessame</title>
</item>
...

Aug. 26, 2009
-------------

I noted that there is a ContentPackage method to get the resource id. Added it, along with a URL element, to getAll to produce:

<response>
<callerIP>127.0.0.1</callerIP>
<packages>
<package>
<title>Sessame</title>
<packageId>4</packageId>
<resourceId>a4aee400-2724-4cb5-83d8-a1c57e37bc16</resourceId>
<URL/>
</package>
</packages>
</response>

The resource id given allows me to construct the full CHS reference id needed to locate resources managed by the Scorm Service.

Back to the delete question. Given a course id and the true Scorm package name, I should be able to delete a Content Package.
The news is basically good: delete looks up the Scorm package id and deletes it using contentService.removeContentPackage(). The record in the ScormTool is removed (which is the main hurdle I was looking to clear) and the expanded zip collection is deleted. Two problems remain: the original zip file is still present, which needs to be cleaned up, and removeContentPackage() throws ResourceNotDeletedException in spite of the fact that everything seems to be deleted. The first can be handled by basic CHS removeResource(). The second will just have to be ignored, I guess.
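A sketch of that delete flow; the ContentPackage accessor names (getTitle(), getContentPackageId()) are assumptions based on the getAll output above, the SCORM model/service imports are omitted because I haven't confirmed their packages here, and zipResourceId stands in for wherever the original archive id comes from:

import java.util.List;
import org.sakaiproject.content.cover.ContentHostingService;

// Look up the content package by title, remove it through the SCORM content
// service, then clean up the original zip archive with plain CHS.
forcePlacement(courseId);

List packages = contentService.getContentPackages();
for (Object obj : packages)
{
    ContentPackage cp = (ContentPackage) obj;  // accessor names assumed
    if (name.equals(cp.getTitle()))
    {
        try
        {
            contentService.removeContentPackage(cp.getContentPackageId());
        }
        catch (Exception e)
        {
            // removeContentPackage() throws ResourceNotDeletedException even
            // though everything appears to be gone; ignore it.
        }
        break;
    }
}

// The original zip file is not removed by the service, so remove it directly.
ContentHostingService.removeResource(zipResourceId);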

Tested delete and it seems to work just fine. Assuming that replace should work, too, since it is just a delete then add.
Copied over get() from LBGFile.

This should complete coding of the LB Gateway application, unless move is strongly desired.

Aug. 27, 2009
-------------

Zip upload debugging.

        • LBG - Launch point id is: /group/10b31b88-f5a4-4e49-a1e3-095d1d57b11e/Music
          180/index.html
          org.sakaiproject.exception.IdUnusedException id=/group/10b31b88-f5a4-4e49-a1e3-0
          95d1d57b11e/Music180/index.html
          at org.sakaiproject.content.impl.BaseContentService.getResource(BaseContentService.java:4086)
          at org.sakaiproject.content.cover.ContentHostingService.getResource(ContentHostingService.java:624)
          at com.softchalk.lessonbuilder.servlet.LBGZip.handleAdd(LBGZip.java:247)

Nov. 30, 2009
-------------

Attempting to create an academic term in Sakai database:

insert into CM_ACADEMIC_TERM_T (TITLE, DESCRIPTION, START_DATE, END_DATE) values ("Trial Term", "Use this term to create test courses.", "2009-01-01", "2015-12-31");

Worked fine.
