1996 National Research and Development Conference
Industry Meets Academia: Proceedings of the 1996 National Research and Development Conference, The South African Institute of Computer Scientists and Information Technologists, Interaction Conference Centre, University of Natal, Durban, 26 & 27 September, hosted by The Department of Computer Science and Information Systems, University of Natal, Pietermaritzburg, edited by Vevek Ram, (ISBN 0-620-20568-7)
Recent Submissions
Item: The relational organisation model (1996). Laauwen, B; Ram, Vevek.
It has become clear that today's complex society of specialised individuals needs problem-solving techniques that take an integrated, interdisciplinary approach. Belief in business re-engineering is widespread; actual successes of large-scale business re-engineering projects are considerably rarer. Why, if everybody believes in it, is it not practised? In this study I attempt to present convincing arguments that a way out of the organisational impasse can be found by applying the principles of relational database techniques to organisational modelling. I highlight some of the reasons for the popularity of relational data structures in computer applications as opposed to hierarchical data structures. I subsequently look for similar patterns in hierarchical organisational structures. I then apply the principles of relational data modelling to organisational structures and discuss the benefits of the resulting organisational model. Finally, I make recommendations that contribute to solving the problem of 'waste through organisational malfunctioning' by using this model.

Item: Information security management: the second generation (1996). Von Solms, R; Ram, Vevek.
Information security has moved a long way from the early days when physical security, together with a set of backups, formed the backbone of a company's security controls. Today, information security is all about policies, standards, awareness programs, security strategies, etc. The aim of information security management efforts is to enhance confidence in the effectiveness of the information services within an organization. Unfortunately, this confidence is restricted to the organization itself and can only, with great effort, be passed on to external parties.
Today, business partners need to link their computer systems for business reasons, but first want to receive some sort of proof that the other partner has an adequate level of information security in place. A security evaluation and certification scheme that can instill confidence and assurance regarding information security status in external business parties would solve a lot of problems for the commercial world. This approach to information security management, proving adequate information security to external parties, is termed in this paper the Second Generation of Information Security Management.

Item: Modular neural networks subroutines for knowledge extraction (1996). Vahed, A; Cloete, I; Ram, Vevek.
Current research in modular neural networks (MNNs) has essentially two aims: to model systematic methods for constructing neural networks of high complexity, and to provide building blocks for hybrid symbolic and connectionist knowledge-based implementations. The principal benefit of MNNs is that they combine the desirable features of different neural network architectures while compensating for their individual weaknesses. This paper reviews several models of modular neural networks and describes a method for constructing modular neural network subroutines that facilitate easier knowledge extraction. We explore this feature and further consider the generalization abilities of network subroutines as compared with conventional neural network architectures.

Item: The feasibility problem in the simplex algorithm (1996). Scott, TG; Hattingh, JM; Steyn, T; Ram, Vevek.
The simplex method is one way of solving a linear programming problem (LP problem). The simplex method needs a basic feasible solution to start the solution process, which requires an available non-singular basis. Normally it is difficult to identify a non-singular basis from the problem data.
Thus the simplex method introduces artificial variables, which provide a non-singular basis. An artificial objective function is also introduced in such a way that the cost coefficients of the artificial variables are positive (negative), while the cost coefficients of the other variables are zero. By minimizing (maximizing) this new objective function with the simplex algorithm, a feasible solution for the original problem is obtained. This is also known as the phase-one problem of the simplex algorithm. Our paper concentrates on finding pivot selection heuristics for the phase-one problem, with the objective of improving the performance of the algorithm. If no artificial variables are basic, an optimum solution for phase one has been found. In our approach we show how a feasible solution can be obtained by using an evaluation function. This evaluation function guides the pivot process of the simplex method in such a way that it will, as a first priority, try to pivot artificial variables out of the basis. We show how this evaluation function can be used in conjunction with other well-known column selection heuristics. We also show how this approach can be used with multiple pricing, a technique that strives to minimize data transformations at each step. The evaluation function is also employed to improve numerical stability in the solution process. Experimental work on some problems from the NETLIB test suite is presented. This empirical work also compares our approach to well-known column selection heuristics.

Item: A methodology for integrating legacy systems with the client/server environment (1996). Redelinghuys, M; Steenkamp, AL; Ram, Vevek.
The research is conducted in the area of software methodologies, with the emphasis on the integration of legacy systems with client/server environments.
The investigation starts by identifying the characteristics of legacy systems in order to determine the features and technical characteristics required of the generic integration methodology. A number of methodologies are evaluated with respect to their features and technical characteristics in order to derive a synthesis for a generic methodology. The revised spiral model (Boehm, 1986; Du Plessis & Van der Walt, 1992) is customised to arrive at a software process model which provides a framework for the integration of legacy systems with client/server environments.
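As background to the simplex abstract above: the phase-one construction it describes (an artificial identity basis plus an artificial objective equal to the sum of the artificial variables) can be sketched in a few lines. This is a minimal illustrative implementation using the plain Dantzig most-negative-reduced-cost rule; it does not reproduce the paper's evaluation-function heuristic, multiple pricing, or NETLIB experiments, and the function name `phase_one` is ours.

```python
# Illustrative phase-one simplex sketch (plain Dantzig rule); the paper's
# evaluation-function heuristic and multiple pricing are not reproduced here.
import numpy as np

def phase_one(A, b, tol=1e-9):
    """Seek a basic feasible solution of {x >= 0 : A x = b} by adding one
    artificial variable per row and minimising their sum (phase one)."""
    A = np.array(A, dtype=float)
    b = np.array(b, dtype=float)
    m, n = A.shape
    # Flip rows so b >= 0; the artificial identity basis is then feasible.
    neg = b < 0
    A[neg] *= -1.0
    b[neg] *= -1.0
    # Tableau [A | I | b]; artificials n..n+m-1 form the starting basis.
    T = np.hstack([A, np.eye(m), b.reshape(-1, 1)])
    basis = list(range(n, n + m))
    # Reduced-cost row for c = (0,...,0, 1,...,1): price out the basic
    # artificials, each of which carries unit cost.
    cost = np.concatenate([np.zeros(n), np.ones(m), [0.0]]) - T.sum(axis=0)
    while True:
        j = int(np.argmin(cost[:n + m]))   # Dantzig: most negative reduced cost
        if cost[j] >= -tol:
            break                          # phase-one optimum reached
        col = T[:, j]
        ok = col > tol
        if not ok.any():
            raise RuntimeError("phase one cannot be unbounded")
        ratios = np.full(m, np.inf)
        ratios[ok] = T[ok, -1] / col[ok]
        i = int(np.argmin(ratios))         # ratio test picks the leaving row
        T[i] /= T[i, j]
        for r in range(m):
            if r != i:
                T[r] -= T[r, j] * T[i]
        cost -= cost[j] * T[i]
        basis[i] = j
    x = np.zeros(n)
    for i, bi in enumerate(basis):
        if bi < n:
            x[bi] = T[i, -1]
    feasible = -cost[-1] < 1e-8            # objective = sum of artificials
    return feasible, x
```

The system is feasible exactly when the phase-one optimum drives the sum of artificials to zero; the paper's contribution is in choosing the entering column so that artificial variables leave the basis as early as possible, which this plain Dantzig sketch does not attempt.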