Error 500: No selector attribute (cookie/header/name/parameter) was specified
Forum: Struts — Use of logic:equal tag

Parijat Mukherjee, Greenhorn, Posts: 26, posted 12 years ago
I'm getting this error when I try to reload my JSP page in sorted form:

E SRVE0026E: [Servlet Error]-[No selector attribute (cookie/header/name/parameter) was specified]: javax.servlet.jsp.JspException: No selector attribute (cookie/header/name/parameter) was specified
at org.apache.struts.taglib.logic.CompareTagBase.condition(CompareTagBase.java:249)

Can someone help, please? It is urgent.

poornima balagopal, Ranch Hand, Posts: 83, posted 12 years ago
Hi, I am not sure what you are trying to do. If you are trying to use the logic:equal tag, it is used as follows:
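The usage example that originally followed this reply was lost in extraction. The error itself comes from Struts' CompareTagBase, which requires exactly one "selector" attribute (name, cookie, header, or parameter) on comparison tags such as logic:equal; omitting all of them triggers the JspException above. A typical usage sketch (the bean and property names here are illustrative, not from the thread):

```jsp
<%@ taglib uri="/WEB-INF/struts-logic.tld" prefix="logic" %>

<%-- Compare a bean property: "name" is the required selector attribute --%>
<logic:equal name="userBean" property="role" value="admin">
    You are an administrator.
</logic:equal>

<%-- Or compare a request parameter instead of a bean --%>
<logic:equal parameter="sortOrder" value="asc">
    Results are sorted in ascending order.
</logic:equal>
```

If a logic:equal (or logic:notEqual, logic:lessThan, etc.) tag is rendered with only a value attribute and no name/cookie/header/parameter, the page fails with exactly the "No selector attribute" error reported above, so the first thing to check is the tag on the page being reloaded.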
I execute this action by calling it from a form. I don't know what I'm doing wrong; I hope you can help me.

```java
public class ComprobarUsuarioAction extends Action {
    public ActionForward execute(ActionMapping mapping, ActionForm form,
            HttpServletRequest request, HttpServletResponse response) throws Exception {
        ActionForward forward = new ActionForward(); // return value
        DynaActionForm daf = (DynaActionForm) form;
        String usuario = (String) daf.get("usuario");
        String pass = (String) daf.get("password");
        // check for null BEFORE calling equals(), otherwise a null value
        // throws a NullPointerException before the null test is reached
        if (usuario == null || usuario.equals("") || pass == null || pass.equals("")) {
            return mapping.findForward("noAutorizado");
        }
        // validate against a text file
        String ruta = this.getServlet().getServletContext().getRealPath("/usuarios.jsp");
        // FileInputStream is never null: the constructor throws FileNotFoundException
        // if the file is missing, so no null check is needed here
        FileInputStream fis = new FileInputStream(ruta);
        DataInputStream dis = new DataInputStream(fis);
        boolean enc = false;
        // available() >= 0 is always true and would loop forever; use > 0
        while (dis.available() > 0 && !enc) {
            String userLeido = dis.readLine();
            String passLeido = dis.readLine();
            if (userLeido != null && userLeido.equals(usuario)
                    && passLeido != null && passLeido.equals(pass)) {
                enc = true;
                forward = mapping.findForward("autorizado");
                UsuarioBean user = new UsuarioBean(usuario, pass);
                // expose it to the JSP
                // this.getServlet().getServletContext().setAttribute("user", user);
                request.setAttribute("user", user);
            }
        }
        return forward;
    }
}
```

The JSP page that picks up the bean starts like this:

```jsp
<%@ taglib uri="/WEB-INF/struts-html.tld" prefix="html" %>
<%@ taglib uri="/WEB-INF/struts-bean.tld" prefix="bean" %>
<%@ taglib uri="/WEB-INF/struts-logic.tld" prefix="logic" %>
```
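One bug in the posted action is worth isolating: the original condition `usuario.equals("") || usuario == null` performs the tests in the wrong order, so a null `usuario` throws a NullPointerException before the null check ever runs. A minimal standalone sketch of the corrected ordering (the `isBlank` helper is hypothetical, not part of the post):

```java
public class NullCheckDemo {
    // Null-safe blank check: test for null BEFORE calling equals().
    // Short-circuit evaluation of || guarantees equals() is only
    // reached when s is known to be non-null.
    static boolean isBlank(String s) {
        return s == null || s.equals("");
    }

    public static void main(String[] args) {
        System.out.println(isBlank(null));   // true, no NullPointerException
        System.out.println(isBlank(""));     // true
        System.out.println(isBlank("user")); // false
    }
}
```

With the original ordering, `isBlank(null)` would throw instead of returning true, which is exactly the failure mode the action risks when the form submits an empty request.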